I often need to extract archives; sometimes I just need to look at file contents, especially for source-tarball-only projects. That doesn't necessarily mean I need to compile them, but there is also a benefit to compiling source on a file system that resides in memory, such as tmpfs: it is fast.
My /tmp is tmpfs, not a file system on a physical disk. Programs need it, and I rely on it too; even at only 1 GB, it already helps a lot. I wish my computer had something like 16 GB of memory, then I could compile anything on it.

tmpfs is fast, and using it rather than a normal file system does not reduce your hard disk's life, since it cuts down the number of writes to the physical disk. If you haven't taken advantage of it, you should from now on. A physical disk file system is really slow compared to tmpfs. I don't know how fast an SSD can be, but I don't think it can beat tmpfs: data still has to go through the disk interface before it reaches or is retrieved from the SSD's memory.
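If your /tmp isn't tmpfs yet, one common way to set it up is an fstab entry. A sketch, assuming a 1 GB cap like mine; the size and options are examples, not requirements:

```shell
# Hypothetical /etc/fstab line: mount /tmp as tmpfs, capped at 1 GB,
# with the sticky, world-writable mode /tmp normally has.
#   tmpfs   /tmp   tmpfs   size=1g,mode=1777   0 0

# To check whether /tmp is already tmpfs:
df -T /tmp    # "tmpfs" in the Type column means it lives in RAM
```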
It always takes me a while to figure out the correct option to tell an archive program to extract to /tmp. So, here is a short list which may help me:

bunzip2 -c foobar.bz2 > /tmp/foobar
gunzip -c foobar.gz > /tmp/foobar
tar xf whatever.tar.supports.tar.ext -C /tmp
unrar x foobar.rar /tmp
unzip foobar.zip -d /tmp

You may ask: didn't you download the archive to the directory already? If the archive follows packing conventions, then you know there will be a new directory after you unpack it. But that's not always the case, especially with Zip archives. If you are not careful, you will end up with something like a hundred files in your current directory mixed in with your other files, and then you swear about it when you can't just remove them all and redo.
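You can spot such a "tarbomb" before extracting by listing the archive first. A small self-contained demo: it builds a sample tar with two top-level entries (all names here are made up), then lists it the way I would before unpacking an unknown download:

```shell
# Build a throwaway sample archive with two top-level entries.
workdir=$(mktemp -d)
mkdir "$workdir/pkg"
echo data > "$workdir/pkg/a.txt"
echo data > "$workdir/loose.txt"
tar cf "$workdir/whatever.tar" -C "$workdir" pkg loose.txt

tar tf "$workdir/whatever.tar"        # list members without extracting
# Count distinct top-level entries; more than one means extracting into
# the current directory would mix new files with your own:
tar tf "$workdir/whatever.tar" | cut -d/ -f1 | sort -u | wc -l   # prints 2
```

unzip -l foobar.zip does the same kind of listing for Zip files.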
So, I always create a new directory whenever I suspect such a thing might happen. I don't want to move the archive, so this extraction directory is needed. Of course, I could change the working directory to the new directory and then use a relative path to the archive, but why miss a chance to learn something you don't know?
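The habit can be wrapped into a tiny shell function. A sketch: the name safe_extract and the /tmp/&lt;name&gt;.d layout are my own conventions, not anything standard:

```shell
# safe_extract: unpack an archive into a fresh directory under /tmp so a
# tarbomb cannot scatter files into the current directory. The archive
# itself stays where it is.
safe_extract() {
    archive=$1
    dest="/tmp/$(basename "$archive").d"   # e.g. /tmp/foobar.zip.d
    mkdir -p "$dest"
    case $archive in
        *.tar|*.tar.gz|*.tgz|*.tar.bz2|*.tar.xz)
            tar xf "$archive" -C "$dest" ;;
        *.zip)
            unzip -q "$archive" -d "$dest" ;;
        *)
            echo "safe_extract: don't know how to handle $archive" >&2
            return 1 ;;
    esac
    echo "$dest"
}
```

Then safe_extract foobar.zip prints the directory it extracted into, and the original archive never moves.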
However, tmpfs isn't good for all scenarios. Computers crash, and content in tmpfs isn't persistent: once the power is gone, whatever was in tmpfs is phewwoof, too. One time I lost a few pages of writing in LibreOffice, and there was no recovery process. Heck, LibreOffice didn't even know about the crash; the temporary files were gone with it.
After that experience, I changed LibreOffice's temporary directory to a newly created directory that resides on the physical disk, so I wouldn't yell "Frak!" again.
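The change itself is small. A sketch, where ~/.lotemp is just a name I made up, and the menu path is from memory, so verify it in your LibreOffice version:

```shell
# Create a directory on the physical disk for LibreOffice's temporary
# files. ~/.lotemp is an arbitrary, illustrative name.
mkdir -p "$HOME/.lotemp"
# Then point LibreOffice at it via:
#   Tools > Options > LibreOffice > Paths > Temporary files
```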