
Archive for January, 2006

System Backup

I used to make a joke about learning foreign languages: the first sentence you should learn is always how to say "I cannot speak XXX" in that language. That is the very first step toward successful, informative, and effective communication.

It may seem a little pessimistic if you compare it to something like "it is so nice to meet you" or "what is your name", but it puts the worst case under control.

When I applied this idea to computer administration, I realized that system backup, or hazard control in general, should be near the top of the list. It is a shame to admit that I only got around to reading some documentation on it this afternoon, when I finally realized I could no longer afford to lose my home directory. It used to be expendable, holding movies, junk programs, and the like; but using Linux has helped me accumulate quite a bit of quick-fix code, notes, configuration files, and so on.
A few days ago, I was asked to help recover a broken Windows system that refused to boot because of the infamous "NTLDR is missing" problem. Without physical access and without an XP install CD, I could only offer to help my friend copy the data off the dead machine. I asked him to boot from an Ubuntu Live CD and install an OpenSSH server; I then ssh-ed in and was able to mount the Windows file system.

It was right after this easy and happy experience that I made a stupid mistake. I thought the missing NTLDR was caused by a corrupted Partition Boot Record, the first 512 bytes of the partition. So I used dd to copy that data from a bootable Windows host and wrote it to the dead machine before copying my friend's files off that drive. I had forgotten what I originally planned to do, and now the partition was dead and I could not mount it any more.

Why didn't I copy the files out first? Why didn't I back up those 512 bytes before overwriting them, such a high-risk action?
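In hindsight, the safety net is a single dd command. Here is a minimal sketch of the back-up-then-restore routine, run against a scratch image file so it is safe to try; on a real machine you would point `if=`/`of=` at the partition device itself (e.g. /dev/sda1 — the device name here is an assumption):

```shell
# Use a scratch image file standing in for the partition.
img=demo-partition.img

# Create a fake 1 MiB "partition" with a recognizable boot record.
dd if=/dev/zero of="$img" bs=1M count=1 2>/dev/null
printf 'FAKEPBR' | dd of="$img" conv=notrunc 2>/dev/null

# 1. Save the existing 512-byte boot record BEFORE touching anything.
dd if="$img" of=pbr-backup.bin bs=512 count=1 2>/dev/null

# 2. The risky write (here we simulate a bad overwrite of the PBR).
dd if=/dev/zero of="$img" bs=512 count=1 conv=notrunc 2>/dev/null

# 3. If things go wrong, put the saved boot record back.
dd if=pbr-backup.bin of="$img" bs=512 count=1 conv=notrunc 2>/dev/null
```

The few seconds spent on step 1 are what turn a destroyed partition into a non-event.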

Thanks to TestDisk, I was able to recover the partition, but only after about two hours of searching.

I later figured out that a few seconds of backup would have saved hours of hassle. And you know what? Backing up a file system is as easy as backing up a 512-byte partition boot sector, thanks to GNU tar.

I am using the technique known as incremental backup. A level 0 backup does a raw dump of the directories you select into a tar archive. A later level 1 backup compares the current state of those directories against the level 0 backup and archives only the changed and new files. Similarly, a level 2 backup is based on a previous level 1 backup. A snapshot file is how a lower-level backup communicates its state to the next higher-level backup.

By default, GNU tar does incremental backups against a single snapshot file: each backup reads the snapshot file and then writes back to that very file. Without a workaround, GNU tar will therefore produce a level 0 backup, followed by a level 1, then a level 2, and so on. Although this keeps each backup as small as possible, it complicates restoration, since you have to restore all of these backups one after another. Simply keeping a copy of the level 0 snapshot file lets you create independent level 1 backups. What I am doing is a commonly recommended approach: do regular level 0 backups, and do several independent level 1 backups in the interval between two consecutive level 0 backups. Restoring under this scheme needs only the latest level 0 backup plus the latest level 1 backup, a good trade-off between backup complexity and restore effort.
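The scheme above can be sketched with GNU tar's `--listed-incremental` option. Directory and file names are made up for the example, and this assumes GNU tar (BSD tar lacks this option):

```shell
mkdir -p data backups
echo "original" > data/notes.txt

# Level 0: full dump, recording file state in a snapshot file.
tar --create --listed-incremental=backups/snapshot.0 \
    --file=backups/level0.tar data

# Keep a pristine copy of the level-0 snapshot, so every later
# level-1 backup diffs against level 0 rather than against each other.
cp backups/snapshot.0 backups/snapshot.master

echo "changed" >> data/notes.txt
echo "new file" > data/extra.txt

# Level 1: start from a copy of the master snapshot, then archive
# only files changed or added since level 0.
cp backups/snapshot.master backups/snapshot.1
tar --create --listed-incremental=backups/snapshot.1 \
    --file=backups/level1.tar data
```

To restore, extract level0.tar first and then the latest level1.tar on top of it, passing `--listed-incremental=/dev/null` so tar handles the archives in incremental mode.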



JavaScript gives me the impression of a language without a usable `standard' library. While C comes with libc and Java comes with a huge built-in class library, JavaScript provides only a few objects (Date, String, etc.) and the browser-related DOM. What if I want to use JavaScript for some general-purpose computation? Of course, JavaScript was designed to interact with the browser, but using it to build richer interfaces means more general-purpose computation behind the scenes.

Thanks to the wider adoption of Ajax, there is a strong trend toward using JavaScript to build rich web interfaces, and therefore a strong need for libraries covering a lot of general functionality, such as browser abstraction (thanks to the browser wars and the evil/buggy IE), data structures, and even widgets. A Slashdot thread gave me a chance to glance at a few of these libraries.

  • Dojo: Dojo is one of the early players (at least it is the first one I knew about), and also one of the most comprehensive. It even comes with a crypto sub-library. Dojo uses Rhino for development, which I find interesting. Gotta give Rhino a try.
  • Mochikit: I have not had a chance to read it carefully yet, but at least I like the documentation, so I will study it further.
  • Qooxdoo: First of all, bad name. It takes a different approach: developers are encouraged to build web applications in JavaScript code rather than HTML, working much as they would in a general GUI toolkit. For example, a developer creates a window instance, defines several callback functions, and so on. It is more like a widget library that happens to have its backend in the browser.
  • Behaviour: Another way to bind callback functions to elements such as <a>. I like this work very much. It is a good way to keep an application backward-compatible with NoScript browsers: no JavaScript action is bound to elements in the HTML, and callbacks are bound in JavaScript code, which means no JavaScript, no binding. Besides, the code is so much cleaner and more maintainable. Excellent job.
  • JSON: Another way to exchange data between the server and client sides. Normally XML is the way to go; JSON seems simpler and maps directly to native objects without any extra coding work.


Some people think of me as a freak when they realize I use Linux as my daily OS, even in a computer science department. When I tell them how great Linux is and how it keeps improving in an open manner, I get arrogant refusals: "it is so much worse than my Windows".

Since we live in a world dominated by Windows, unfortunately, switching to Linux may take some time and involve a lot of suffering. It is interesting to see how someone with a strong UNIX background fares when forced to switch to Windows. It is a fun read, and it can definitely help Windows people get a better view of modern Linux.

I had used Windows exclusively for more than four years before I completely switched to Linux two years ago. I have to admit I went down some bad paths at first: hunting for applications similar to those I used on Windows (like gftp vs. leapftp), installing Wine, and so on. In other words, I was recreating a pseudo-Windows environment. I failed to realize that Linux has a different culture from Windows, and that the Linux one is superior in almost every way, especially in productivity and flexibility. Once I got used to the Linux way of doing things, Windows was no longer an option, and I have the same feeling as the guy above: on Windows it is so hard to benefit from the open source community, all the handy little tools are neither installed by default nor easy to install, and Windows tools are designed in such a non-integratable way.

Windows will remain a non-option for me unless it becomes POSIX compliant and gets a full GNU port.
