Run multiple exec commands at once (But wait for the last one to finish)

Asked · Active 3 hr before · Viewed 126 times

8 Answers

90%


Script being run in parallel:

// waitAndDate.php

<?php
sleep((int) $_GET['time']);
printf('%d secs; %s', $_GET['time'], shell_exec('date'));

Script making calls in parallel:

// multiExec.php

<?php
$start = microtime(true);

$mh = curl_multi_init();
$handles = array();

// create several requests
for ($i = 0; $i < 5; $i++) {
   $ch = curl_init();

   $rand = rand(5, 25); // just making up data to pass to script
   curl_setopt($ch, CURLOPT_URL, "http://domain/waitAndDate.php?time=$rand");
   curl_setopt($ch, CURLOPT_HEADER, 0);
   curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
   curl_setopt($ch, CURLOPT_TIMEOUT, 30);

   curl_multi_add_handle($mh, $ch);
   $handles[] = $ch;
}

// execute requests and poll periodically until all have completed
$isRunning = null;
do {
   curl_multi_exec($mh, $isRunning);
   usleep(250000);
} while ($isRunning > 0);

// fetch output of each request
$outputs = array();
for ($i = 0; $i < count($handles); $i++) {
   $outputs[$i] = trim(curl_multi_getcontent($handles[$i]));
   curl_multi_remove_handle($mh, $handles[$i]);
}

curl_multi_close($mh);

print_r($outputs);
printf("Elapsed time: %.2f seconds\n", microtime(true) - $start);

Here is some output I received when running it a few times:

Array
   (
      [0] => 8 secs; Mon Apr 2 19:01:33 UTC 2012
      [1] => 8 secs; Mon Apr 2 19:01:33 UTC 2012
      [2] => 18 secs; Mon Apr 2 19:01:43 UTC 2012
      [3] => 11 secs; Mon Apr 2 19:01:36 UTC 2012
      [4] => 8 secs; Mon Apr 2 19:01:33 UTC 2012
   )
Elapsed time: 18.36 seconds

Array
   (
      [0] => 22 secs; Mon Apr 2 19:02:33 UTC 2012
      [1] => 9 secs; Mon Apr 2 19:02:20 UTC 2012
      [2] => 8 secs; Mon Apr 2 19:02:19 UTC 2012
      [3] => 11 secs; Mon Apr 2 19:02:22 UTC 2012
      [4] => 7 secs; Mon Apr 2 19:02:18 UTC 2012
   )
Elapsed time: 22.37 seconds

Array
   (
      [0] => 5 secs; Mon Apr 2 19:02:40 UTC 2012
      [1] => 18 secs; Mon Apr 2 19:02:53 UTC 2012
      [2] => 7 secs; Mon Apr 2 19:02:42 UTC 2012
      [3] => 9 secs; Mon Apr 2 19:02:44 UTC 2012
      [4] => 9 secs; Mon Apr 2 19:02:44 UTC 2012
   )
Elapsed time: 18.35 seconds
88%

I've looked around for this and I can't seem to find anyone who is trying to do exactly what I am. I need to run all of the execs at once, and I need to wait for them all to complete before returning. I also need the output of all of the scripts stored in the array called $execout. One suggested approach: if you want to wait until ALL execs are finished before continuing, you have to make a callback from the external script.

I have information that is passed in to my function via a _POST request. Based on that data, I run an exec command to run a TCL script a certain number of times (with different parameters, based on the post variable). Right now, I have the exec in a foreach so this takes forever to run (the TCL script takes 15 or so seconds to come back, so if I need to run it 100 times, I have a bit of an issue). Here is my code:

    public function executeAction() {
       // code to parse the _POST variable into an array called $devices
       // (note: the key/value form below fixes the original loop, which
       // iterated as $devID but then read from an undefined $device)
       foreach ($devices as $devID => $device) {
          exec("../path/to/script.tcl -parameter1 " . $device['param1'] .
             " -parameter2 " . $device['param2'], $execout[$devID]);
       }
       print_r($execout);
    }
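One way to parallelize the loop above without curl is to have the shell launch each run in the background, send each run's output to its own file, and wait for all of them before collecting results. A minimal sketch; the real ../path/to/script.tcl is replaced here by a sleep-plus-echo stand-in, and the file layout is an assumption, not the asker's code:

```shell
#!/bin/sh
# One background job per "device"; each job writes to its own file
# so outputs never interleave. sleep+echo stands in for the TCL script.
outdir=$(mktemp -d)
for dev in 1 2 3; do
    ( sleep 1; echo "result for device $dev" ) > "$outdir/$dev.out" &
done
wait    # block until every background job has exited

# Gather the per-job outputs, mirroring the $execout array in the PHP code
for f in "$outdir"/*.out; do
    cat "$f"
done
rm -r "$outdir"
```

All three jobs run concurrently, so the whole script takes about one second rather than three, and every job's output is available after the single wait.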
72%

Notice the & at the end of each command: it puts the command in the background, so the script launches the next one immediately, and so on. The first three wget commands will therefore execute in parallel; "wait" makes the script pause until those 3 have finished. Once they finish, the script will simultaneously run the next 6 commands, wait until they complete, and so on. You can confirm all these commands are executing simultaneously from another shell by looking at the process list (in our case it should show 3 wget commands as three different processes). Alternatively, you can put the commands in a file and ask GNU parallel to execute them all simultaneously.

 

root@instance-10:/tmp# cat test_parallel.sh
#!/bin/bash
wget https://storage.googleapis.com/test-bucket-sarath/junkfile1 &
wget https://storage.googleapis.com/test-bucket-sarath/junkfile2 &
wget https://storage.googleapis.com/test-bucket-sarath/junkfile3 &
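The effect of & plus wait is easiest to see with sleep in place of the downloads (a sketch, not the source's script): three one-second background jobs finish in roughly one second total, not three, because wait returns as soon as the slowest job exits.

```shell
#!/bin/sh
start=$(date +%s)
sleep 1 &     # each job goes straight to the background...
sleep 1 &
sleep 1 &
wait          # ...and wait blocks until all three have exited
end=$(date +%s)
echo "all jobs done in $((end - start))s"   # roughly 1 second, not 3
```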
65%

You need to run several commands, but some take a while and you don't want to wait for the last one to finish before issuing the next command. Assume that we want to run three commands: long, medium, and short, each of whose execution time is reflected in its name. We need to run them in that order, but don't want to wait around for long to finish before starting the other commands. One solution, arguably the best, is to run the commands in sequence on one line; if you want each program to run regardless of whether the preceding ones fail, separate them with semicolons. Another rather simple solution is to type those commands into a file and then tell bash to execute the commands in the file, i.e., a simple shell script (aka batch file). Here's a primitive way to do that:

$ cat > simple.script
long
medium
short
^D              # Ctrl-D, not visible
$ bash ./simple.script
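The choice of separator matters when commands can fail. A quick illustration (false and true stand in for failing and succeeding commands): a semicolon runs the next command unconditionally, while && stops at the first failure.

```shell
#!/bin/sh
false ; echo "semicolon: this still runs"     # ; ignores the failure
false && echo "&&: this line is skipped"      # && short-circuits on failure
true  && echo "&&: this runs"                 # && continues on success
```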
75%

Assume 3 groups of commands, as in the code below. In each group the three commands are started in the background with &. The commands within each group run in parallel, and the groups run sequentially, each group of parallel commands waiting for the previous group to finish before starting execution. You probably don't want your parent shell to exit before the groups have finished, hence the final wait.

After all three commands in the third group exit, command 10 will execute.

$ cat command_groups.sh
#!/bin/sh

command() {
   echo $1 start
   sleep $(($1 & 03))   # keep the seconds value within 0-3
   echo $1 complete
}

echo First Group:
command 1 &
command 2 &
command 3 &
wait

echo Second Group:
command 4 &
command 5 &
command 6 &
wait

echo Third Group:
command 7 &
command 8 &
command 9 &
wait

echo 'Not really a group, no need for background/wait:'
command 10

$ sh command_groups.sh
First Group:
1 start
2 start
3 start
1 complete
2 complete
3 complete
Second Group:
4 start
5 start
6 start
4 complete
5 complete
6 complete
Third Group:
7 start
8 start
9 start
8 complete
9 complete
7 complete
Not really a group, no need for background/wait:
10 start
10 complete
$
40%

Your code seems fine, as I am able to run a similar script on my machine; as a tester, try a command that shows the current time and then pauses, leaving the prompt open, which works fine for me. If instead you just want to run a command in a particular directory, you can do that without the shell: set the command's working directory before executing it.

I'm having trouble figuring out how to run multiple commands using the os/exec package. I've trawled the net and Stack Overflow and haven't found anything that works for my case. Here's my source:

package main

import(
   _ "bufio"
   _ "bytes"
   _ "errors"
   "fmt"
   "log"
   "os"
   "os/exec"
   "path/filepath"
)

func main() {
   ffmpegFolderName := "ffmpeg-2.8.4"
   path, err := filepath.Abs("")
   if err != nil {
      fmt.Println("Error locating absolute file paths")
      os.Exit(1)
   }

   folderPath := filepath.Join(path, ffmpegFolderName)

   _, err2 := folderExists(folderPath)
   if err2 != nil {
      fmt.Printf("The folder %s either does not exist or is not in the same directory as make.go\n", folderPath)
      os.Exit(1)
   }

   cd := exec.Command("cd", folderPath)
   config := exec.Command("./configure", "--disable-yasm")
   build := exec.Command("make")

   cd_err := cd.Start()
   if cd_err != nil {
      log.Fatal(cd_err)
   }
   log.Printf("Waiting for command to finish...")
   cd_err = cd.Wait()
   log.Printf("Command finished with error: %v", cd_err)

   start_err := config.Start()
   if start_err != nil {
      log.Fatal(start_err)
   }
   log.Printf("Waiting for command to finish...")
   start_err = config.Wait()
   log.Printf("Command finished with error: %v", start_err)

   build_err := build.Start()
   if build_err != nil {
      log.Fatal(build_err)
   }
   log.Printf("Waiting for command to finish...")
   build_err = build.Wait()
   log.Printf("Command finished with error: %v", build_err)
}

func folderExists(path string) (bool, error) {
   _, err := os.Stat(path)
   if err == nil {
      return true, nil
   }
   if os.IsNotExist(err) {
      return false, nil
   }
   return true, err
}

I want to run the commands like I would from a terminal: cd path; ./configure; make. So I need to run each command in order and wait for it to finish before moving on. With my current version of the code it says "./configure: no such file or directory". I assume that is because cd path executes in one process and ./configure executes in a new one, instead of inheriting the directory from the previous command. Any ideas?

UPDATE: I solved the issue by changing the working directory of the Go process itself and then executing the ./configure and make commands:

err = os.Chdir(folderPath)
if err != nil {
   fmt.Println("File Path Could not be changed")
   os.Exit(1)
}
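The shell equivalent makes the failure mode visible: a cd performed in one child process (or subshell), which is what each separate exec.Command amounts to, is lost as soon as that process exits. A sketch:

```shell
#!/bin/sh
# A cd inside a subshell is confined to that child process,
# just like a cd run via its own exec.Command in Go:
( cd / && echo "inside child: $(pwd)" )
echo "parent is still in its original directory: $(pwd)"

# Chaining with && inside ONE shell keeps the directory for the
# later commands (ffmpeg-2.8.4 is the question's folder name):
#   ( cd ffmpeg-2.8.4 && ./configure --disable-yasm && make )
```

This is why changing the working directory of the Go process itself (os.Chdir above) fixes the problem: ./configure and make then start from the right directory.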
22%

I tried the following command formats to run multiple commands in the background, but both fail. How can I run multiple commands containing & on one command line? Note that wait waits for a background process to finish and returns its termination status. (As terdon commented on Dec 29 '17: "(myCommand1 &) && (myCommand2 &) will run myCommand2 even if myCommand1 failed.")

If you want to run them sequentially:

(myCommand1; myCommand2) &

or

(myCommand1 & ) && (myCommand2 & )

If you want them to run parallel:

myCommand1 & myCommand2 &

In bash you can also use this (the space after the { and the ; before the } are mandatory):

{
   myCommand1 && myCommand2;
} &
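The claim above that wait returns the background job's termination status can be checked directly; $! holds the PID of the most recently backgrounded job (false stands in for any failing command):

```shell
#!/bin/sh
false &            # background a job that exits with status 1
pid=$!             # $! is the PID of that job
wait "$pid"        # block until it exits; wait returns its status
echo "background job exited with status $?"   # prints: ...status 1
```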
60%

exec() executes the given command; its first parameter is the command that will be executed, and its return value is the last line of that command's output. To get the full output of the executed command, be sure to set and use the output parameter. If you need to execute a command and have all of its data passed directly back without any interference, use the passthru() function.

Returned with status 0 and output:
Array
(
    [0] => cmb
)
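The same capture-the-output pattern in shell: command substitution collects stdout and $? carries the exit status. Here echo "cmb" stands in for whatever command produced the output above:

```shell
#!/bin/sh
output=$(echo "cmb")   # stand-in for the command whose output is captured
status=$?              # exit status of that command
echo "Returned with status $status and output: $output"
# prints: Returned with status 0 and output: cmb
```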
