
kill the running jobs on the cluster when I hit ctrl-C in the CLI #162

Open
xelax opened this issue Oct 18, 2012 · 3 comments

xelax (Contributor) commented Oct 18, 2012

When I quit the scoobi "launcher" program on the CLI gateway machine, the launcher exits but the hadoop job keeps running, and I have to go and kill it manually. Ideally scoobi would catch the ctrl-C and kill the hadoop job before quitting.

espringe (Contributor) commented

Isn't this hadoop's normal behavior? But I agree that if you abort a scoobi project, killing the job makes sense, since a half-finished scoobi job isn't useful to anyone. But what about the case where it's executing the last map-reduce job of the pipeline? In that case it would seem to match the semantics to let it run to completion?

xelax (Contributor, Author) commented Oct 22, 2012

Well, if I decide to hit ctrl-C it means I just want to stop the computation, so I'd keep it simple and just kill all the hadoop jobs. It's not uncommon to realize that you started a job with some fatal flaw.
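
For what it's worth, here is a minimal sketch of the kind of JVM shutdown hook that could do this, assuming the launcher keeps a handle on each submitted mapreduce `Job`. The `RunningJobs` registry and `register` method are hypothetical names for illustration, not Scoobi's actual API:

```scala
import org.apache.hadoop.mapreduce.Job
import scala.collection.mutable.ListBuffer

// Hypothetical registry of the Jobs this launcher has submitted;
// Scoobi's real job-tracking structures may look different.
object RunningJobs {
  private val jobs = ListBuffer.empty[Job]

  def register(job: Job): Unit = synchronized { jobs += job }

  // Registered once: when the JVM shuts down (e.g. ctrl-C in the CLI),
  // ask the cluster to kill any job that is still running.
  sys.addShutdownHook {
    synchronized {
      jobs.filterNot(_.isComplete).foreach { j =>
        try j.killJob()
        catch { case _: Throwable => () } // best effort during shutdown
      }
    }
  }
}
```

Since ctrl-C sends SIGINT and the JVM runs shutdown hooks before exiting, killing everything still running there would give the "stop the whole computation" semantics described above.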

raronson (Contributor) commented Jan 8, 2013

Just wanted to add that Pig has this behavior: it kills any running M/R jobs when you hit ctrl-C.
