* Re: a command "spool", ie queueing commands?
[not found] ` <1122969644l.1939l.0l@hyttynen>
@ 2005-08-02 13:14 ` asterr
2005-08-04 0:57 ` Emiliano Castagnari
1 sibling, 0 replies; 6+ messages in thread
From: asterr @ 2005-08-02 13:14 UTC (permalink / raw)
To: urgrue; +Cc: Adrian C., linux-admin
See the batch command.
On Tue, 2 Aug 2005, urgrue wrote:
> > Could you please give us a real-life example?
>
> Well, for example imagine you're doing some maintenance on a server.
> You want to move some big files, run updatedb by hand, run a script
> that parses a bunch of files, and finally merge some video files.
>
> Because these are very disk-heavy operations, it's much better and
> quicker if they are not run at once, but are run sequentially instead.
>
> So normally you'd probably use "at" to run one now, another in an hour,
> etc. But this is clumsy because you have to guess whether the
> job is done or not. You could also run them in sequence by putting them
> all in the same "at" job, but it's not possible to add or remove
> commands from an "at" job once it's created. Also, "at" is "user-aware",
> i.e. if two users set up two "at" jobs, they will happily run at the same
> time. What I propose would add all commands to the same queue, even if
> a different user is the one launching the command (but of course the
> commands should be run AS the user who started it).
>
> So what I'm thinking is like this (lets assume the command is called
> "cs" for "command spooler"):
> me:~> cs
> Usage: cs [add|remove|show] <command>
> me:~> cs show
> no entries
> me:~> cs add mv /path/hugefiles /other/path/hugefiles
> command added to queue
> me:~> cs show
> 1 [running] : mv /path/hugefiles /other/path/hugefiles
> me:~> cs add updatedb
> command added to queue
> me:~> cs add /path/my_file_parsing_script /path/lots/of/files
> command added to queue
> me:~> cs add avimerge -i clip1.avi clip2.avi clip3.avi -o allclips.avi
> command added to queue
> me:~> cs show
> 1 [running] : mv /path/hugefiles /other/path/hugefiles
> 2 [queued] : updatedb
> 3 [queued] : /path/my_file_parsing_script /path/lots/of/files
> 4 [queued] : avimerge -i clip1.avi clip2.avi clip3.avi -o allclips.avi
>
>
> As I do lots of disk-heavy operations, I would find this INCREDIBLY
> useful.
>
> urgrue
>
>
> > --Adrian.
> >
> > On 8/2/05, urgrue <urgrue@tumsan.fi> wrote:
> > > i realized it would be useful to be able to add commands into a
> > command
> > > queue, from where they would get executed in sequence. for example
> > > numerous large hard disk-intensive operations would be better
> > executed
> > > in sequence rather than at once.
> > > in other words, exactly like a printer spool, but for commands. you
> > > could add commands in, list the queue, and remove from the queue.
> > >
> > > does anyone know of something like this?
> > >
> > >
> > > -
> > > To unsubscribe from this list: send the line "unsubscribe
> > linux-admin" in
> > > the body of a message to majordomo@vger.kernel.org
> > > More majordomo info at http://vger.kernel.org/majordomo-info.html
> > >
> >
> >
>
>
> -
> To unsubscribe from this list: send the line "unsubscribe linux-admin" in
> the body of a message to majordomo@vger.kernel.org
> More majordomo info at http://vger.kernel.org/majordomo-info.html
>
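The interface sketched above is small enough to prototype in plain shell. The following is a hypothetical illustration (no existing "cs" tool is assumed): jobs are stored as numbered files in a spool directory, and a worker drains them one at a time, which is what makes the queue strictly sequential.

```shell
#!/bin/sh
# Minimal sketch of the proposed "cs" spooler (hypothetical; "cs" is not
# an existing tool). Jobs are shell snippets stored as numbered files in
# a spool directory; a worker runs them one at a time, in order.
SPOOL="${CS_SPOOL:-$HOME/.cs-spool}"

cs_add() {                       # cs add <command...>
    mkdir -p "$SPOOL"
    # Next sequence number: highest existing job number plus one.
    last=$(ls "$SPOOL" 2>/dev/null | sort -n | tail -n 1)
    printf '%s\n' "$*" > "$SPOOL/$(( ${last:-0} + 1 ))"
    echo "command added to queue"
}

cs_show() {                      # cs show
    if [ -z "$(ls "$SPOOL" 2>/dev/null)" ]; then
        echo "no entries"
        return
    fi
    for f in $(ls "$SPOOL" | sort -n); do
        echo "$f [queued] : $(cat "$SPOOL/$f")"
    done
}

cs_run() {                       # worker: drain the queue sequentially
    while f=$(ls "$SPOOL" 2>/dev/null | sort -n | head -n 1); [ -n "$f" ]
    do
        sh "$SPOOL/$f"           # run the job to completion...
        rm -f "$SPOOL/$f"        # ...then remove it and take the next
    done
}
```

Removing a queued job is just deleting its numbered file before the worker reaches it. The multi-user single-queue behaviour urgrue describes would additionally need a shared spool directory and per-job ownership tracking, which this sketch omits.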
* Re: a command "spool", ie queueing commands?
[not found] ` <1122969644l.1939l.0l@hyttynen>
2005-08-02 13:14 ` a command "spool", ie queueing commands? asterr
@ 2005-08-04 0:57 ` Emiliano Castagnari
2005-08-04 7:26 ` urgrue
1 sibling, 1 reply; 6+ messages in thread
From: Emiliano Castagnari @ 2005-08-04 0:57 UTC (permalink / raw)
To: urgrue; +Cc: Adrian C., linux-admin
On Tue, 02 Aug 2005 - 08:00, urgrue wrote:
> >Could you please give us a real-life example?
>
> Well, for example imagine you're doing some maintenance on a server.
> You want to move some big files, run updatedb by hand, run a script
> that parses a bunch of files, and finally merge some video files.
>
> Because these are very disk-heavy operations, it's much better and
> quicker if they are not run at once, but are run sequentially instead.
>
> So normally you'd probably use "at" to run one now, another in an hour,
> etc. But this is clumsy because you have to guess whether the
> job is done or not. You could also run them in sequence by putting them
> all in the same "at" job, but it's not possible to add or remove
> commands from an "at" job once it's created. Also, "at" is "user-aware",
> i.e. if two users set up two "at" jobs, they will happily run at the same
> time. What I propose would add all commands to the same queue, even if
> a different user is the one launching the command (but of course the
> commands should be run AS the user who started it).
Hi !!
You could try something like:
$ command1 && command2 && command3
"&&" is a shell operator: the next command runs only if the previous
command exits with status 0 (success).
It works in bash and sh; I don't know if it'll work on other shells.
The only drawback I can see is that if any of these commands finishes with
an error, that will be the end of your queue :(
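Both behaviours are easy to see at the prompt (a minimal illustration):

```shell
#!/bin/sh
# "&&" runs the next command only if the previous one exited with status 0;
# ";" runs it unconditionally, success or failure.
false && echo "skipped"     # prints nothing: false exits nonzero, && stops
false ;  echo "ran anyway"  # prints "ran anyway": ";" ignores the failure
true  && echo "chained"     # prints "chained": true exited 0
```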
Another way is to simply develop it yourself ... it doesn't seem like a
very hard thing to code :)
Cheers !!
- --
# Emiliano Castagnari
[ -========================================================================- ]
('> # Debian Sarge - GNU/Linux - lambda 2.6.10-l01 #
Gnu / (/) # JID: torian@lug.fi.uba.ar
'-- - } [ Libera tu mente - Libera tu Codigo ] { -
[ -========================================================================- ]
[ GPGKey: http://gpg.lug.fi.uba.ar:11371/pks/lookup?op=get&search=0x0B8AF76F ]
* Re: a command "spool", ie queueing commands?
2005-08-04 0:57 ` Emiliano Castagnari
@ 2005-08-04 7:26 ` urgrue
2005-08-04 11:16 ` Glynn Clements
0 siblings, 1 reply; 6+ messages in thread
From: urgrue @ 2005-08-04 7:26 UTC (permalink / raw)
To: linux-admin
On 08/04/2005 03:57:10 AM, Emiliano Castagnari wrote:
> You could try something like:
>
> $ command1 && command2 && command3
Better would be command1 ; command2 ; command3, because then it won't
depend on the previous command exiting successfully.
But nevertheless it lacks the ability for me to add and remove commands
from the queue after it's started, which is almost the main feature
that I need.
Basically it would be good enough if "at" could do "at -next", which
would run the job only after any already-running jobs had completed.
The "batch" command is good, but it decides whether to start a job based
on CPU load, whereas I'm looking for the same thing based on disk load.
> Another way, is to simply develop it ... it doesn't seem like a very
> hard one to code :)
This may be true. It might be easiest to do it as an addition to "at",
since all the spooling logic and the daemon are already there.... I can't
code worth a damn, but maybe I could swing altering "batch" to monitor
disk load instead... if only there were a general disk load monitor as
reliable as loadavg...
Or perhaps "at -next" would be easier. Instead of looking at the time,
look if a job is running - if one is, postpone the job by a few
minutes.
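For what it's worth, a crude approximation of that "run after the current job finishes" behaviour is possible with stock tools, assuming Linux's flock(1) from util-linux is available; the lock path and the "runq" name below are made up for illustration:

```shell
#!/bin/sh
# Hypothetical "runq": serialize jobs through one lock file so that jobs
# submitted from any shell run one at a time. Needs flock(1) (util-linux).
# Note: waiters are not strictly FIFO; the next job to run is whichever
# waiter acquires the lock first.
QLOCK=/tmp/cmdqueue.lock      # illustrative path; any shared file works

runq() {
    # Block until fd 9 holds an exclusive lock, then run the job;
    # the lock is released automatically when the subshell exits.
    ( flock 9 && "$@" ) 9>>"$QLOCK"
}

# From one terminal:   runq mv /path/hugefiles /other/path/hugefiles
# From another:        runq updatedb     # waits for the mv to finish
```

This serializes disk-heavy jobs without any daemon, though unlike a real spooler there is no queue listing and no way to remove a waiting job other than killing it.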
* Re: a command "spool", ie queueing commands?
2005-08-04 7:26 ` urgrue
@ 2005-08-04 11:16 ` Glynn Clements
2005-08-04 16:27 ` command spooling tangent Jim Roy
0 siblings, 1 reply; 6+ messages in thread
From: Glynn Clements @ 2005-08-04 11:16 UTC (permalink / raw)
To: urgrue; +Cc: linux-admin
urgrue wrote:
> > You could try something like:
> >
> > $ command1 && command2 && command3
>
> Better would be command1 ; command2 ; command3, because then it won't
> depend on the previous command coming out true.
Well, whether or not it's better depends upon whether you want to
conditionalise execution of one command upon success of the previous
command.
I find it far more common for scripts to use ";" when they really
should have used "&&" than the converse. A particularly common case
is "cd foo ; dosomething"; this should almost always be using "&&".
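The failure mode is easy to reproduce (directory names below are illustrative):

```shell
#!/bin/sh
# With ";", a failed cd is ignored and the next command runs in the
# ORIGINAL directory; with "&&", the next command never runs at all.
workdir=$(mktemp -d) && cd "$workdir"
cd missingdir ; touch oops     # cd fails, so oops is created in $workdir
cd missingdir && touch safe    # cd fails, so touch never runs
ls "$workdir"                  # shows only: oops
```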
--
Glynn Clements <glynn@gclements.plus.com>
* command spooling tangent
2005-08-04 11:16 ` Glynn Clements
@ 2005-08-04 16:27 ` Jim Roy
0 siblings, 0 replies; 6+ messages in thread
From: Jim Roy @ 2005-08-04 16:27 UTC (permalink / raw)
To: Glynn Clements; +Cc: linux-admin
On Thu, 4 Aug 2005 12:16:12 +0100
Glynn Clements <glynn@gclements.plus.com> wrote:
>
> I find it far more common for scripts to use ";" when they really
> should have used "&&" than the converse. A particularly common case
> is "cd foo ; dosomething"; this should almost always be using "&&".
>
Guilty as charged :-) Try, for example, the old standard:
tar -cf - * | ( cd targetdir; tar -xf - )
where targetdir doesn't actually exist. I don't THINK anything
bad happened when I did this, but I haven't actually looked
at all the images I was trying to move either.
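For the record, the "&&" version closes exactly that hole; wrapped up as a tiny function (the name "copytree" is made up here):

```shell
#!/bin/sh
# Copy a directory tree with the tar-pipe idiom. Using "&&" on both
# sides guarantees neither tar runs in the wrong directory: if either
# cd fails, its side of the pipeline simply does nothing.
copytree() {    # copytree <sourcedir> <targetdir>
    ( cd "$1" && tar -cf - . ) | ( cd "$2" && tar -xf - )
}
```

For example, copytree /path/images /backup/images; if the target directory doesn't exist, nothing is extracted anywhere, instead of being unpacked back into the source directory.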
Jim Roy
* Re: a command "spool", ie queueing commands?
[not found] <1122963263l.32575l.0l@hyttynen>
[not found] ` <60a7468905080200315678d209@mail.gmail.com>
@ 2005-08-09 0:54 ` Mike Castle
1 sibling, 0 replies; 6+ messages in thread
From: Mike Castle @ 2005-08-09 0:54 UTC (permalink / raw)
To: linux-admin
In article <1122963263l.32575l.0l@hyttynen>, urgrue <urgrue@tumsan.fi> wrote:
>i realized it would be useful to be able to add commands into a command
>queue, from where they would get executed in sequence. for example
>numerous large hard disk-intensive operations would be better executed
>in sequence rather than at once.
>in other words, exactly like a printer spool, but for commands. you
>could add commands in, list the queue, and remove from the queue.
>
>does anyone know of something like this?
GNU Queue is one such beast, though unfortunately the released versions no
longer compile in modern environments, and it's currently undergoing an
overhaul. See the pages on gnu.org for more details.
These days I use a set of simple bash scripts that do most of what you ask
except for the removal of commands (and you have to keep a command line
window open for each command). But then, I wrote it for me. :->
A couple of years ago I saw someone post about writing something similar
in Python. I'm not sure if this "simple job manager" at Stanford is it or
not:
http://www.slac.stanford.edu/BFROOT/www/Computing/Distributed/Bookkeeping/SJM/SJMMain.htm
mrc