Jay Taylor's notes

shell - How do I join two named pipes into single input stream in linux - Server Fault

Original source (serverfault.com)
Tags: bash shell-scripting pipes streams shell-tricks wizardry serverfault.com
Clipped on: 2020-02-05

Using the pipe (|) feature in Linux I can forward the standard output of one process into the standard input of another.

I can use tee to split the output to separate sub-processes.

Is there a command to join two input streams?

How would I go about this? How does diff work?

Personally, my favorite (requires bash and other things that are standard on most Linux distributions):

The details can depend a lot on what the two things output and how you want to merge them ...

Contents of command1 and command2, one after the other, in the output:

cat <(command1) <(command2) > outputfile

Or if both commands output alternate versions of the same data that you want to see side-by-side (I've used this with snmpwalk; numbers on one side and MIB names on the other):

paste <(command1) <(command2) > outputfile

Or if you want to compare the output of two similar commands (say a find on two different directories):

diff <(command1) <(command2) > outputfile
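
For example, a hedged sketch comparing the file listings of two directories (dir1 and dir2 are placeholder names; -printf '%P\n' is GNU find syntax that prints paths relative to each starting directory, so the prefixes don't make every line differ):

diff <(find dir1 -type f -printf '%P\n' | sort) <(find dir2 -type f -printf '%P\n' | sort) > outputfile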

Or if they're ordered outputs of some sort, merge them:

sort -m <(command1) <(command2) > outputfile
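
For instance, a minimal hedged sketch merging two word lists (file names are hypothetical; sort -m assumes each input is already sorted, which the inner sorts guarantee here):

sort -m <(sort words1.txt) <(sort words2.txt) > outputfile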

Or run both commands at once (could scramble things a bit, though):

cat <(command1 & command2) > outputfile

The <() operator sets up a named pipe (or /dev/fd) for each command, piping the output of that command into the named pipe (or /dev/fd filehandle reference), and passes the name on the command line. There's an equivalent with >(). You could do: command0 | tee >(command1) >(command2) >(command3) | command4 to simultaneously send the output of one command to 4 other commands, for instance.
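
As a concrete sketch of that fan-out pattern (the file names here are hypothetical, not from the answer), this checksums and compresses a log while still counting its lines on stdout:

cat access.log | tee >(md5sum > access.log.md5) >(gzip -c > access.log.gz) | wc -l

tee's own stdout continues down the pipe, so wc -l sees the same data that went to the two process substitutions.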

answered Aug 16 '10 at 18:28
awesome! I've read bash's manpage lots of times but hadn't picked up on that one – Javier Aug 16 '10 at 21:12
    You can find the reference in the Advanced Bash-Scripting Guide (tldp.org/LDP/abs/html/process-sub.html) at the Linux Documentation Project – brice Jul 8 '11 at 15:50
    I was able to prevent interleaved lines by piping through grep --line-buffered - handy for concurrently grep'ing the tail of multiple log files (see the sketch below, and stackoverflow.com/questions/10443704/line-buffered-cat) – RubyTuesdayDONO Apr 8 '13 at 20:47
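
    A hedged sketch of that comment's approach (the log file names and the pattern are hypothetical):

    (tail -f /var/log/app1.log & tail -f /var/log/app2.log) | grep --line-buffered ERROR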

    You can append two streams to another with cat, as gorilla shows.

    You can also create a FIFO, direct the output of the commands to that, then read from the FIFO with whatever other program:

    mkfifo ~/my_fifo
    command1 > ~/my_fifo &
    command2 > ~/my_fifo &
    command3 < ~/my_fifo
    

    Particularly useful for programs that will only write to or read from a file, or for mixing a program that only outputs to stdout (or a file) with one that only supports the other.

    answered Aug 16 '10 at 17:43
    This one works on pfSense (FreeBSD) whereas the accepted answer does not. Thank you! – Nathan Jul 7 '16 at 15:05

    (tail -f /tmp/p1 & tail -f /tmp/p2) | cat > /tmp/output
    

    /tmp/p1 and /tmp/p2 are your input pipes, while /tmp/output is the output.
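
    A hedged sketch of setting those pipes up first (command1 and command2 are placeholders for whatever should feed them):

    mkfifo /tmp/p1 /tmp/p2
    command1 > /tmp/p1 &
    command2 > /tmp/p2 &
    (tail -f /tmp/p1 & tail -f /tmp/p2) | cat > /tmp/output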

    Note: Unless both commands inside the () flush their output on every line (and some other obscure POSIX rules for atomicity), you could end up with some weird scrambling on the input to cat ... – freiheit Aug 16 '10 at 18:41
  • Should you not be using a semicolon instead of an ampersand character? – Samir Jul 4 '15 at 23:38
  • this is Epic stuff – Mobigital May 22 '19 at 0:52

    I have created a special program for this: fdlinecombine

    It reads multiple pipes (usually program outputs) and writes them to stdout linewise (you can also override the separator)

    answered Jul 26 '12 at 0:09
    Works as advertised. Thank you for making it public. – alexei Mar 13 '15 at 22:49

    A really cool command I have used for this is tpipe; you might need to compile it because it's not that common. It's really great for doing exactly what you're talking about, and it's so clean I usually install it. The man page is located here: http://linux.die.net/man/1/tpipe . The currently listed download is at this archive: http://www.eurogaran.com/downloads/tpipe/ .

    It's used like this:

    ## Reinject sub-pipeline stdout into standard output:
    $ pipeline1 | tpipe "pipeline2" | pipeline3
    
    answered Jan 24 '12 at 7:34

    Be careful here; just catting them will end up mixing the results in ways you may not want: for instance, if they're log files you probably don't really want a line from one inserted halfway through a line from the other. If that's okay, then

    tail -f /tmp/p1 /tmp/p2 > /tmp/output

    will work. If that's not okay, then you're going to have to find something that will do line buffering and only output complete lines. Syslog does this, but I'm not sure what else might.

    EDIT: optimization for unbuffered reading and named pipes:

    Considering /tmp/p1, /tmp/p2, /tmp/p3 as named pipes, created by "mkfifo /tmp/pN":

    tail -q -f /tmp/p1 /tmp/p2 | awk '{print $0 > "/tmp/p3"; close("/tmp/p3"); fflush();}' &

    Now, this way, we can read the output named pipe /tmp/p3 unbuffered with:

    tail -f /tmp/p3

    There is a small bug of sorts: you need to "initialize" the 1st input pipe /tmp/p1 with:

    echo -n > /tmp/p1

    so that tail will accept input from the 2nd pipe /tmp/p2 first and not wait until something arrives on /tmp/p1. This may not be needed if you are sure /tmp/p1 will receive input first.

    Also, the -q option is needed so that tail does not print the "==> filename <==" headers.
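
    Putting the edit's pieces together as one hedged sketch (command1 and command2 are placeholder producers; the ordering matters so that nothing blocks on opening a pipe):

    mkfifo /tmp/p1 /tmp/p2 /tmp/p3
    tail -q -f /tmp/p1 /tmp/p2 | awk '{print $0 > "/tmp/p3"; close("/tmp/p3"); fflush();}' &
    echo -n > /tmp/p1    # "initialize" the 1st pipe so tail moves on to /tmp/p2
    command1 > /tmp/p1 &
    command2 > /tmp/p2 &
    tail -f /tmp/p3      # read the merged output, line by line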

    More useful would be: "tail -q -f /tmp/p1 /tmp/p2 | another_command", as it will be done line by line, and with the -q option it will not print any other garbage – readyblue Oct 22 '14 at 19:27
  • For unbuffered file/named pipe use: tail -q -f /tmp/p1 /tmp/p2 | awk '{print $0 > "/tmp/p3"; close("/tmp/p3"); fflush();}' & Now /tmp/p3 can even be a named pipe and you can read it by simply tail -f /tmp/p3. All this is UNBUFFERED = line by line. There is, however, a small bug of sorts: the 1st file/named pipe needs to be initialized first so that tail will accept the output from the 2nd, so you will need to echo -n > /tmp/p1 and then everything will work smoothly. – readyblue Oct 22 '14 at 20:47

    The best program for doing this is lmerge. Unlike freiheit's answer it's line-oriented, so the output of the two commands won't clobber each other. Unlike other solutions, it fairly merges the input so that no command can dominate the output. For example:

    $ lmerge <(yes foo) <(yes bar) | head -n 4
    

    Gives output of:

    foo
    bar
    foo
    bar
    
    answered May 11 '17 at 22:16