View Single Post
Posts: 540 | Thanked: 387 times | Joined on May 2009
#4
That is a TON of questions.

I do NOT recommend "Learning Linux" from an embedded device such as the N900. Stick with Linux installation on a desktop/laptop or even live-cds if you have to. For partitioning burn a GParted LiveCD. Go with Grub1 (0.97) instead of Grub2 so you can get an understanding of boot parameters.
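For a taste of what Grub1 exposes, a menu.lst entry looks roughly like this (the disk numbering and kernel paths here are illustrative, not from any particular system):
Code:
```
title   Debian GNU/Linux
root    (hd0,0)
# the kernel line is where boot parameters such as "ro" and "quiet" live
kernel  /boot/vmlinuz root=/dev/sda1 ro quiet
initrd  /boot/initrd.img
```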

Attempt a Gentoo install, I'm sure by the time you have that up and running most of your questions will be answered (I am kidding, that's a cruel way of indoctrinating someone into Linux - I know because I was talked into that when I was n00b. Very masochistic distro.)

Compiling packages can vary, A LOT. Especially if you are cross-compiling.
The "standard format" is:
1. Download the source archive
2. Extract (x) it with tar, decompressing through gunzip (z), verbosely (v), reading from the given file (f). Use tab-completion to fill in the filename (press Tab twice, or CTRL+I on some distros). If it's a *.tar.bz2, substitute z with j (tar xjvf foo.tar.bz2)
Code:
$ tar xzvf coreutils-5.0.tar.gz
3. From the shell navigate into the newly created folder
Code:
$ cd coreutils-5.0
4. (Optional) List the contents of the new folder
Code:
$ ls
5. Check the configure options available for this particular package, if any (for example --prefix=/usr), then run ./configure with whatever options you need
Code:
$ ./configure --help
$ ./configure
6. Run make, which reads the Makefile and invokes the compiler (usually GCC)
Code:
$ make
The compiled binaries can then be run straight from the build tree, e.g.
( cd src; ./hostname ) or ./src/hostname
7. As root install the compiled binaries in the appropriate places in the userland.
Code:
# make install
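The extraction steps above can be rehearsed safely with a throwaway archive before you grab a real tarball (the package name here is made up, nothing gets downloaded):
Code:
```shell
# Rehearse steps 1-3 with a dummy archive instead of a real download
mkdir -p demo-1.0
echo 'hello' > demo-1.0/README
tar czvf demo-1.0.tar.gz demo-1.0   # c=create an archive (the inverse of x)
rm -r demo-1.0
tar xzvf demo-1.0.tar.gz            # x=extract: recreates the demo-1.0/ folder
cd demo-1.0
ls
```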
As far as extensions go, a lot of the older stuff still uses them, and extensions really do make a lot of things easier. For example, a common way to package source code is as a .tar.gz, which is a tar archive compressed with gzip. Compression scrambles the original bytes (it's basically poor man's encryption, though that's not the intended purpose), meaning a program can't necessarily read the inner file's magic-number/header information to determine the filetype, particularly if there is any corruption. Hence the file extension is still useful.
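You can see this distinction with the file command, which inspects a file's magic bytes rather than trusting its name (exact output wording varies by version):
Code:
```shell
# `file` reads magic bytes, so renaming a file doesn't fool it
echo 'plain text' > notes.txt
gzip notes.txt                # produces notes.txt.gz
file notes.txt.gz             # reports gzip compressed data
mv notes.txt.gz mystery.bin
file mystery.bin              # still identified as gzip data, despite the name
```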

Binaries tend to lack a file extension, but that is because the files are run directly and have the executable permission set. Scripts usually lack a file extension too (though one is useful as a hint), because they declare their interpreter with a shebang line (examples: #!/bin/sh, #!/usr/bin/perl, #!/usr/bin/env python)
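A quick demonstration of the shebang mechanism (the script name "greet" is arbitrary):
Code:
```shell
# A script with no extension still runs, thanks to its shebang line
cat > greet <<'EOF'
#!/bin/sh
echo "hello from a script"
EOF
chmod +x greet    # set the executable permission
./greet           # the kernel reads the shebang and hands the file to /bin/sh
```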

Before you go and remaster a distro, or go so far as to attempt a Linux From Scratch build, you should simply try out as many distros as you can and see how they vary. I would try: Ubuntu (Debian-based), Fedora (Redhat-based), Puppy, pre-5.0 Knoppix, Sabayon, Gentoo, and really as many as you can. You'll see that they vary greatly. See here: http://en.wikipedia.org/wiki/List_of..._distributions
Find an old desktop (or really just a hard drive with no files you want to keep), burn a bunch of ISOs, wipe the drive and play with some different distros. They all have their pros and cons.

Package managers are an entire discussion by themselves, and that's before you even get into comparing the different formats (.deb, .rpm, .ebuild, .run, etc.)

Regarding your comment about vi: that's down to the PATH variable. Run
$ echo $PATH
and you'll see directories such as /usr/bin listed. You can append other directories to your PATH variable (use ~/.bashrc to make the change persistent).
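In practice that looks like this (the ~/bin directory is just an example location):
Code:
```shell
echo "$PATH"                      # colon-separated list of directories searched for commands
mkdir -p "$HOME/bin"
export PATH="$PATH:$HOME/bin"     # append for the current session only
# for persistence, add the export line above to ~/.bashrc
echo "$PATH"
```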

As for GCC, all I'm going to say is: don't attempt this on the N900, and on Ubuntu-based distros you'll need the package build-essential, among others.
Code:
$ gcc INPUTFILE.c -o OUTPUTFILE `pkg-config --cflags gtk+-2.0` `pkg-config --libs gtk+-2.0`
Compiling means taking a human-readable source file and turning it into machine code, typically built for that specific hardware (for efficiency)

Distro maintainers choose which packages they want to include in their repos. For example, Debian-based distros use man-db, while Redhat-based ones (and other traditionalists) still use man.

There are a lot of competing projects; just look at GNOME and KDE and their respective GTK+ and Qt libraries.

Beryl and Compiz used to be competing projects but now they have merged as Compiz Fusion.

The X server you use on a desktop is more than likely X.Org (x11-xorg), which is a fork of XFree86 (x11-xfree86).

A good resource on GNU/Linux history is Wikipedia. I recommend you read up on the GNU userland and other deviations/alternatives such as the [failed] GoboLinux project.

You certainly have a lot of questions. Google is your friend, along with Wikipedia and also the forums unix.com and linuxquestions.org
Here are a couple of links that may be of some use:
http://wiki.maemo.org/index.php?title=Terminal (I wrote up some of that but it's been changed a lot since then)
http://ss64.com/bash/

Your best bet is to learn to love the terminal. Everything in Linux revolves around it. Learn about virtual terminals (VTs) and the screen command while you're at it.

The man command will help you out a lot. info is a competing help system. You can also often run command --help
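All three approaches, using tar as the example command:
Code:
```shell
tar --help | head -n 2   # quick usage summary, supported by most commands
# man tar                # the full manual page (opens in a pager)
# info tar               # GNU's hypertext manual system
```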

Specific questions help, as "teach me Linux" is just plain impossible. No one knows everything; it's something you learn more about every day. You simply need to use it as your primary OS.

Good luck. Get to know Tux

P.S. I made a 'lil "cheater" function for Step1-3:
Code:
$ wtzc () { wget "$@"; foo=`echo "$1" | sed 's:.*/::'`; tar xzvf "$foo"; bar=`echo "$foo" | sed -e 's/\(.*\)\..*/\1/' -e 's/\(.*\)\..*/\1/'`; cd "$bar"; ls; }
$ wtzc http://mirrors.kernel.org/gnu/coreutils/coreutils-5.0.tar.gz