linux - what are the techniques to extract files safely?

2014-07
  • ceremcem

    Yesterday I was running some experiments on Slitaz. It uses multiple initrd.img files to store files/changes.

    I wanted to extract one of its initrd.gz images (which is a cpio archive) into a folder, edit/remove some files, and repack it.

    I used this command:

    cat rootfs.img | cpio -idvm
    

    All the files were extracted over my root filesystem, and my whole OS got corrupted. (What an embarrassing situation...)

    What should I do to perform such operations safely but easily? Chroot? LXC? (VirtualBox is the last resort.)
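    Whatever sandboxing you pick, one low-tech safeguard is to list an archive's contents before extracting anything: `cpio -t` prints the entry names without writing a single file, so absolute paths stand out in advance. A self-contained sketch (the sample archive here is a stand-in for rootfs.img):

    ```shell
    # Build a tiny sample archive in a scratch directory, standing in
    # for the rootfs.img from the question.
    workdir=$(mktemp -d)
    cd "$workdir"
    mkdir -p src/etc
    echo "hello" > src/etc/motd
    (cd src && find . | cpio -o) > sample.img

    # Inspect first: -t lists entries without extracting anything.
    # Entries starting with / would be a red flag.
    cpio -t < sample.img

    # Only extract once the listing looks safe (all paths relative),
    # and do it inside a dedicated directory, never your home or /.
    mkdir out && (cd out && cpio -idm) < sample.img
    ls out/etc/motd
    ```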

  • Answers
  • Dan D.

    The option you want is --no-absolute-filenames:

    Create all files relative to the current directory in copy-in mode, even if they have an absolute file name in the archive.
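    Applied to the question's scenario, the flag (GNU cpio) makes even an archive that stores absolute paths land under the current directory instead of the real root. A self-contained sketch that builds its own demo archive (the file names are illustrative):

    ```shell
    # Build a demo archive that stores an absolute path, mimicking the
    # dangerous rootfs.img from the question.
    tmp=$(mktemp -d)
    echo "slitaz" > "$tmp/data.txt"
    echo "$tmp/data.txt" | cpio -o > "$tmp/abs.img"

    # Without the flag, extraction would write straight back to
    # $tmp/data.txt; with it, the leading / is stripped and everything
    # lands under the current directory.
    mkdir "$tmp/out"
    (cd "$tmp/out" && cpio -idm --no-absolute-filenames) < "$tmp/abs.img"

    # The file now sits inside out/, mirroring its original path.
    find "$tmp/out" -type f
    ```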


  • Related Question

    linux - Sudoers files requiretty flag security implications
  • Phil

    So I am trying to execute sudo commands via cgi-bin Perl scripts. I want to give sudo access to the apache user for a small subset of commands. Someone before me may have set the requiretty flag. Things like cron and cgi-bin scripts do not get a tty session, so currently if I try to sudo in my script, it tells me about the flag. Also, the apache user will sudo without a password.

    My question to you all is, what are the security implications if I were to disable this flag and continue writing my script?


  • Related Answers
  • John T

    If your script was prone to any sort of injection, all commands entered by a malicious user would be run as root. I don't think it gets any more dangerous than that :)

    I ran into this same issue a while ago. I ended up having user jobs submitted into a "queue folder", which was processed by a script I ran through a crontab every few minutes. The files in the queue were parsed by my scripts with regular expressions, and any files which contained invalid characters (e.g. .*<-_>![]{}()\|/;) were discarded and the user was notified to resubmit.
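    The queue-folder pattern can be sketched in a few lines of shell; this is a minimal illustration of the idea, not John T's actual script, and the allow-list pattern and directory names are assumptions:

    ```shell
    # Process jobs dropped into a queue directory; quarantine any file
    # whose name contains characters outside a strict allow-list.
    queue=$(mktemp -d)      # stand-in for the real queue folder
    rejected=$(mktemp -d)   # quarantine for discarded submissions

    touch "$queue/backup-2014.job"
    touch "$queue/evil;rm -rf.job"

    for f in "$queue"/*; do
        name=$(basename "$f")
        case "$name" in
            *[!A-Za-z0-9._-]*)
                # Disallowed character in the name: do not run it.
                mv "$f" "$rejected/"
                ;;
            *)
                echo "would process: $name"
                ;;
        esac
    done
    ```

    Validating against an allow-list (accept only known-good characters) rather than a deny-list is the safer design here: a deny-list is easy to leave incomplete.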