How to split large file into several smaller files – Linux

Have you ever wanted to split a large file into several small files? I faced this problem a few days ago: I needed to split a large file (a 3 GB log file) into several smaller files so that I could read it in a normal text editor.

To split a large file into several smaller files, you can use the split command in Linux. Just follow the steps below.


  • In your shell, key in:

    split --bytes=1m /path/to/large/file /path/to/output/file/prefix

  • Done. You have just split your large file into several smaller files.

* You can change the output file size by changing --bytes=1m to your preference. You can use the suffixes b, k, or m: b stands for 512-byte blocks, k for kilobytes, and m for megabytes.
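A quick worked example of the command above, assuming GNU split (big.log and the big.log.part. prefix are placeholder names; the example uses an uppercase M suffix, which GNU split accepts for megabytes):

```shell
# Make a 3 MB dummy file to play with (substitute your own file):
dd if=/dev/zero of=big.log bs=1M count=3 2>/dev/null

# Split it into 1 MB pieces named big.log.part.aa, big.log.part.ab, ...
split --bytes=1M big.log big.log.part.

ls big.log.part.*   # three 1 MB chunks
```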

To restore the original file, you can use the cat command.
To join all the smaller files and restore the original file, type:

cat /path/to/output/file/prefix* > /path/to/original/file
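The whole round trip can be exercised end to end with placeholder names (a self-contained sketch, assuming GNU split):

```shell
# Make a sample file and split it into ~100 KB pieces:
seq 1 100000 > original.txt
split --bytes=100K original.txt piece.

# cat concatenates the pieces in alphabetical order, which matches
# the aa, ab, ac... suffixes that split generates:
cat piece.* > restored.txt

# Sanity check: the two files should be byte-identical.
cmp -s original.txt restored.txt && echo "restored OK"
```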




38 Responses to “How to split large file into several smaller files – Linux”

  1. raha says:

Hi, I have a large file in Linux with 235821863 lines. I want to remove line 4788139 and then split the file into two smaller files, each containing the header. How do I do that with a command?

  2. David says:

    Thanks for the info – it saved my day!

  3. Vikrant says:

    Hi All

    I want to know how to separate out a list of files in Linux.

    Like in one dir I have some *.log files, some *.txt and some *.sh.

    How do I separate them out? Is there any specific option in the ls command?

  4. ji says:

    please give an example

  5. hans says:

    Thanks! One of the most useful commands I’ve found! 🙂

  6. Stevo says:

    Your article is helpful, but slightly wrong. It should read :

    split --bytes=1m /path/to/large/file /path/to/output/file/prefix

    an extra - in front of bytes.

  7. chua says:

    Bipin Bahuguna: if it's a text file then you can probably compress it using tar and gzip, then split it into smaller files.
    I don't have experience working with such a large file yet, so the solution might not work, but it's always good to try.
    Good luck 🙂

  8. Hi,

    My application creates log files of up to 400 GB.

    Is there any way to compress them? Any suggestions?

    Bipin Bahuguna

  9. thiyagi says:

    thanks guys, all the posts were really useful..

  10. leoj says:

    Hi guys, is there any way to split a file into a specific number of pieces without knowing its size? For example, file.txt split into 4 pieces.
    If not, is there any way to get the size (only the size) so I can divide it myself?
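    GNU split can in fact do this directly: the -n option divides a file into a fixed number of chunks without you computing the size first (a sketch, assuming GNU coreutils; file names are placeholders):

```shell
# Make a sample file, then split it into exactly 4 chunks by size:
seq 1 1000 > file.txt
split -n 4 file.txt quarter.     # quarter.aa .. quarter.ad

# Rejoining works the same as with --bytes:
cat quarter.* > rejoined.txt
cmp -s file.txt rejoined.txt && echo "round trip OK"
```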

  11. max says:

    zip -s 5000000 output input1 input2 etc.

  12. ToxAtec says:

    I think froff meant the single '-' character.
    From the split manpage:
    “With no INPUT, or when INPUT is -, read standard input.”

    For example, it tells split to read from the pipe (created by the ‘|’) instead of a specified file. Or it tells tar to write to standard output.

    By the way, this article / discussion is really helpful!
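    A minimal sketch of that stdin '-' in action (chunk. is a placeholder prefix):

```shell
# '-' tells split to read standard input instead of a named file:
printf 'abcdef' | split -b 2 - chunk.

ls chunk.*    # chunk.aa chunk.ab chunk.ac, two bytes each
```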

  13. chua says:

    the - is the way to include additional options.
    it’s very common in linux and unix

  14. froff says:

    What does that ‘-‘ single character mean in command line?


  15. MadManII says:

    Whoops, guess I should have proof read that.

    After the tar command and its parameters (jcf, cf, cfz, etc.), and after the -, you'll of course need to tell it what file(s) and/or directory to use; and after the split command and its parameters, just before the output filename, be sure to use the final - as well.

    Here’s a better, yet simple example (remove quotes):
    "tar cvfp - filename.iso |bzip2 -9 -c |split -a 3 -db 500M - split_file.tbz."
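    The matching restore for that style of pipeline, as a self-contained round trip (demo paths are placeholders; a sketch assuming GNU tar, gzip, and split, with gzip standing in for bzip2):

```shell
# Round trip: tar a directory straight into split pieces, no big
# intermediate file, then restore it elsewhere.
mkdir -p demo restore
echo "hello" > demo/a.txt

# Archive to stdout ('-'), compress, and split the stream into pieces:
tar cf - demo | gzip -c | split -b 1K - demo.tgz.

# Restore: concatenate the pieces, decompress, and untar into ./restore:
cat demo.tgz.* | gunzip -c | tar xf - -C restore

cmp -s demo/a.txt restore/demo/a.txt && echo "archive OK"
```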

  16. MadMan says:

    Attention to: Grahack

    tar cfz - |split -b 100M split_file.tgz. (note the last dot, optional though)

    gzip with level 9 compression:
    tar cf - |gzip -9 -c |split -b 100M split_file.tgz.

    tar jcf |split -b 100M split_file.tbz.

    bzip2 with level 9 compression:
    tar cf - |bzip2 -9 -c |split -b 100M split_file.tbz.

    then simply cat split_file* > file.tgz or .tbz and untar/gunzip or bunzip2 accordingly.

    In most Linux distros, this should work as is. In OpenBSD or FreeBSD, I think the syntax may be a little different; the man pages will say though. As for other *nix OSes, YMMV. =)

    Hope this helps…

  17. Grahack says:

    Hi all,
    resurrecting this old interesting post to ask a question about tar.
    I’m creating an archive doing:

    tar cfz /media/disk/0912171726.tar.gz /data

    Is there a way to use ‘split’ and a pipe to create a few files WITHOUT creating the big file first?

  18. tijani says:

    For 3gp files: they have a header that QuickTime or other players look for to be able to play them. So you cannot use split to cut one into two playable 3gp files, even if you change the extension. You have to use a dedicated splitter for media files that also copies the file's header.

  19. chua says:

    not really familiar with 3gp files.
    but I think you can try Ulead MediaStudio or Adobe Premiere.
    Both are good for video editing.

  20. Jim Clayson says:

    Okay, thanks for your reply.

    Do you have any suggestions for a suitable video editor?


  21. chua says:

    hi Jim, once you split the file, you can't play it in any player.
    You should use a video editor to cut the file and re-render it.
    This Linux command is not suitable for your case.

  22. Jim Clayson says:

    Oh and one more thing is I’m using cygwin on winXP to perform the split.


  23. Jim Clayson says:

    How should I split a ‘.3gp’ file so that I can upload to facebook without exceeding their upload size limit?

    I tried using split --bytes=1024K V181109_13.50.3gp pref

    .. but this produced two files both of which Quicktime can not play.

    I renamed the files with a 3gp suffix before I tried playing them, but that, I think, is not relevant.


  24. Ian says:

    Thanks! I had to import a ~50 MB sql file on a remote server. It would take too long for the sql script to run, and it would throw security errors. It was difficult to work with a ~50 MB sql file as most text editors would lock up. This command, with some slight alteration, worked great: I was able to split the file up into smaller chunks and import chunk by chunk… thanks for the tip!

  25. Mark Stelios says:

    What would the command be to split a 2.52 GB file in half in Linux?

  26. Nofew says:

    While you can’t use the Type command, you can use copy /b on windows:

    copy /b "file.01"+"file.02"+"file.03" "file"

    The /b switches the command into binary mode.

  27. scign says:

    can’t use “type” on windows if you’re working with binary files. great for ascii though. type translates some control characters so output is not the same as input for binary files, which therefore end up corrupted.

  28. […] split -b200m book.rar book-split  Here 200m is 200 MegaByte book.rar is the source file to be splitted. and book-split is prefix of generated file. To restore the file: cat book-split* > NEWFILENAME Source: […]

  29. zazuge says:

    ah, forgot to say thanks. i needed that to split a huge file to give to a friend on two 512 MB USB flash drives.
    i figured out the cat command because it's the DOS equivalent of type.
    it's better like that than using zip multipart; it makes u cooler in your friends' eyes 😉 (using linux makes u smarter)

  30. zazuge says:

    actually in windows you can do:
    $> type filepart1 filepart2 > filefull

  31. ych says:

    I’m using cygwin on WXP to try to put my split files back together, but I cannot cat a file larger than 4GB. Is there another way?

  32. mux says:

    If winxp does not have the copy command you can download the linux cat command for windows. Go to and download the unix utils, they’re free.

    Then you can restore the files just as with the linux example above.

  33. lutzfer says:

    Very nice, but I need to recover the original file on a Windows XP system. The copy command is not found at the command prompt. What can I do then? Thx.

  34. przemeq says:

    Just in case anybody needed (I did): if you use this command to split a binary file on *NIX and then copy the output to a DOS-aware system (Windows, for that matter) you can concatenate all the chunks using:
    $> copy /B chunk* output

  35. Adam says:

    thanks for this post! i found it ’cause i needed it.
    one thing, there should be two dashes before ‘bytes’.
    split --bytes=2m /large/file /smaller/files/prefix
