
I have a bunch of Python scripts that run at various intervals (hourly, daily, monthly, etc). I have trouble keeping track of which jobs fail -- unless I notice an email doesn't show up, or a metric is missing. Is Dcron something I would use to keep track of my cron jobs and whether or not they ran successfully?

At this point I don't need my jobs to be distributed. I'm just looking for a way to keep track of them all and visualize characteristics of each job (start time, duration, success/failure, etc). Ideally, I'd have a way to re-run a job from within the UI.



Disclosure: I work for Iron.io

If you're open to a hosted solution, check out IronWorker: http://www.iron.io/worker

It's an async task processing service with a built-in job scheduler. You can upload your Python scripts to Iron.io, then set schedules and other triggers to execute them on demand. We have a dashboard to manage tasks and schedules, see what ran and what failed, and visualize the characteristics you're looking for. We do distribute the workloads for you, but it still sounds like it could be a good fit.


You can also try cronitor and others like it. You add a curl command at the end of your script that 'touches' an HTTPS endpoint. If your script doesn't check in at predefined intervals, you get alerted. These services are perfect for situations where you don't actually want to set up any infrastructure.
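A minimal sketch of the same pattern in Python instead of curl (the ping URL below is a placeholder, not a real cronitor endpoint -- these services give you a unique URL per job):

  import urllib.request

  def run_job():
      # ... your actual job logic; an exception here skips the ping ...
      pass

  if __name__ == "__main__":
      run_job()
      # Ping the heartbeat endpoint only after the job succeeds.
      # The monitoring service alerts you if no ping arrives on schedule.
      urllib.request.urlopen("https://example-monitor.io/ping/my-job-id", timeout=10)

The key design point is that the ping comes last: any failure or hang upstream means a missed check-in, which is what triggers the alert.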


Hey,

I've been tinkering with this concept recently. It's not quite ready for beta testing, but I'd be happy to take feedback on what I've got so far: http://croncloud.io/


Sure, it can do this well. It can run on a single dcron/etcd node if you don't need it to be fault tolerant. For now you'd have to check job status in the UI, as it doesn't have notifications yet.


Check out Airflow (an open source project from Airbnb that does all of that very well).
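For a sense of what that looks like, here's a minimal Airflow DAG wrapping an existing script (the DAG name, schedule, and paths are made up; Airflow records start time, duration, and state for every run, and you can re-run tasks from the web UI):

  from datetime import datetime, timedelta

  from airflow import DAG
  from airflow.operators.bash_operator import BashOperator

  default_args = {
      "owner": "me",
      "retries": 1,                        # retry once before marking failed
      "retry_delay": timedelta(minutes=5),
      "email_on_failure": True,            # alert when a task fails
      "email": ["you@example.com"],
  }

  # One DAG per schedule; each run's start time, duration, and
  # success/failure are tracked and browsable in the web UI.
  dag = DAG(
      "hourly_reports",                    # hypothetical DAG name
      default_args=default_args,
      schedule_interval="@hourly",
      start_date=datetime(2016, 1, 1),
  )

  run_script = BashOperator(
      task_id="run_report_script",
      bash_command="python /path/to/report.py",  # your existing script
      dag=dag,
  )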



