Back in 2000, when I first ventured onto the 'Net, the majority of home computers got online via 56K dial-up modems over telephone lines, and the Internet was mostly built around those needs. I too did the occasional heavy lifting on my office's computer, connected to a much fatter pipe than a telephone line, but I was able to deal with mail, news, and the web on a 56K modem. Times have changed: increasing home use of Internet resources and the proliferation of home computers have brought down the price of broadband. Most Americans are on DSL or cable modem connections, and the idea of dialing up a connection seems quaint, even antiquated.

But there are still reasons some of us dial up. For starters, I live in West Africa, where I count on dial-up Internet as a backup. Furthermore, my old PIII 555MHz laptop is still going strong, but the network card no longer works (the bus was fried in one of our many power surges), and a modem is all I've got left.

With some careful planning, dial-up isn't so bad. But in a world where the average webpage is now an order of magnitude heavier than it was back when everybody dialed up, some planning is indeed necessary. Linux to the rescue. It's easy to set up a Linux computer to run a downloading mission every time it connects, taking care of your mail and even your basic web browsing in one fell swoop. Here's the secret:

Web Surfing via RSS

If you think about it, your typical web browsing session probably consists of only a few sites that you visit regularly. If those sites offer RSS feeds, downloading them to an offline RSS reader lets you collect headlines from all your sites in a single click. That's convenient even if you're on a faster connection (it will cut your web-surfing time in half, that's for sure). There are at least two good pieces of software that do this: Curn (written in Java) and Rawdog (written in Python). I tried and liked both, but wound up staying with Curn.
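Before diving into Curn's setup, here's the core idea in miniature. An RSS feed is just XML, and the headlines live in <title> elements; the quick-and-dirty sketch below pulls titles out of a saved feed with sed (the sample feed is made up for illustration):

```shell
#!/bin/sh
# Extract headline titles from a saved RSS feed with sed -- a toy
# illustration of what an offline RSS reader does far more robustly.
feed=$(mktemp)
cat > "$feed" <<'EOF'
<rss><channel>
<item><title>Headline one</title></item>
<item><title>Headline two</title></item>
</channel></rss>
EOF
# Print whatever sits between <title> and </title> on each line
titles=$(sed -n 's/.*<title>\(.*\)<\/title>.*/\1/p' "$feed")
echo "$titles"
rm -f "$feed"
```

Real readers add caching, de-duplication, and templated output on top of this; Curn in particular is what the rest of this section configures.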
Curn will fetch the RSS feeds you configure, and it can then do just about anything you want with the information, from sending it to an email address, to passing it to another program, to creating an HTML file. Curn has a nice graphical installer that puts the program in /usr/share/curn. There's a lengthy configuration file that you can use to perform some complicated tasks; I found it necessary to modify only a few lines, as follows. All these settings are within the sample cfg file, and it's just a matter of un-commenting them as necessary and adding the information that pertains to your system.

I configured Curn to do two things with my RSS feeds: send them to a local user by email, and create an HTML file in my home directory. The email is useful, but I find myself reading the news in Links or Lynx, scanning the headlines for interesting articles. It's an extremely efficient way to read the morning news! Here's the code for the mail section:

    SMTPHost: localhost
    MailFrom: Curn User
    MailOutputTo: randymon@localhost
    MailSubject: RSS Feeds

Here's the code for the HTML file section:

    [OutputHandlerFreeMarkerHTML]
    SaveAs: /home/randymon/output.html
    Title: "Today's RSS Feeds for the Randymon"
    Encoding: ISO-8859-1
    TemplateFile: builtin html

Finally, all you have to do is tell Curn which feeds to go fetch, and assign each one a name. Use the following as examples:

    [Feed0]
    URL: http://www.gotonicaragua.com/forum/index.php?action=.xml;type=rss

    [Feed1]
    URL: http://www.nytimes.com/services/xml/rss/nyt/HomePage.xml

    [Feed2]
    URL: http://www.washingtonpost.com/wp-dyn/rss/print/index.xml

And that's it. Run it with the following command:

    curn /usr/local/curn/rwood.cfg

The program executes quickly, and within 30 seconds of connecting to the internet I can find the file output.html in my home directory. I open it with Lynx or Links and start reading.

Email

Downloading email as efficiently and quickly as possible is not much of a problem, either.
Here, the trick is the fetchmail program. Install fetchmail and put your personal configuration file in your home directory. It should look something like this:

    poll MAILSERVER.ISP.COM with proto pop3
        user "MYUSERNAME" there with pass "PASSWORD" is LOCALUSERNAME here
        limit 40000 keep

Obviously, substitute everything in caps with your own data. The two nuances here are the limit of 40000 (that's bytes) and the "keep." With the former, I'm instructing fetchmail not to fetch any message bigger than 40K, which skips just about anything carrying an attachment. With the latter, I'm instructing fetchmail not to remove anything from the server, as I'll access my mailbox the normal way from some other computer with a faster connection. In sum, we download a local copy of any smallish messages using the POP3 protocol, but don't remove them from the server. This obviously only works if you have an ISP that allows you to get your mail using POP3, but many do. If you can't do this, your only alternative is to access your mailbox via webmail (much slower) or IMAP (faster), using a dedicated mail client like mutt.

Usenet News

Leafnode is the trick of the day for offline Usenet news reading, for those of us who still like this now outdated but essential forum. It's not as tricky to install as you'd think, and setup pretty much comes down to two configuration files, both located in /etc/leafnode. In the file "config" you specify your news host and the newsgroups to download, and in the file "filters" you specify messages you don't want to download based on the contents of their headers (spam, probably). For example:

    server = news.ISP.net
    noactive = 1
    username = MYUSERNAME
    password = MYPASSWORD

There are one or two nuances to setting up Leafnode, including the requirement that you choose a fully qualified domain name, but I'll let the Leafnode docs walk you through that easy process. Lastly, don't forget to set up your filters!
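Filter patterns are plain regular expressions matched against article headers, so before committing one to the filters file you can dry-run it against a sample header with grep (the Message-ID below is made up for the test):

```shell
#!/bin/sh
# Dry-run a Leafnode filter pattern against a sample header with grep.
header='Message-ID: <1163456789.abcdef@posting.googlegroups.com>'
matches=$(echo "$header" | grep -c '^Message-ID:.*googlegroups.*')
echo "$matches"   # 1 means the filter would catch this article
```

A count of zero means the pattern would let the article through, so tune it before it goes live.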
In the "filters" file, include lines like the following:

    ^Message-ID:.*googlegroups.*
    ^Message-ID:.*usenetnow.*
    ^Message-ID:.*usenetmonster.*
    ^Subject:.*paypal.*
    ^Subject:.*trainer.*
    ^Subject:.*video.*

The above rejects any message posted from one of those three sources, and any message whose subject contains certain keywords likely to be spam.

Automating the Connection

Rather than running all of these commands by hand, add the Curn run and fetchmail to your PPP scripts, and everything will happen automatically each time you connect. SUSE Linux runs the /etc/ppp/poll.tcpip script upon successfully negotiating a dial-up connection, and you can do anything you want automatically by adding it to that script. For example, the following lines are what I added to the stock SUSE script to have the system automatically run fetchmail and curn, mail the outgoing messages in the queue, and get my Usenet news using Leafnode's fetchnews command.

    #!/bin/bash
    while true ; do
        test -x /usr/bin/fetchmail || break
        test -r /home/randymon/fetchmailrc || break
        /usr/bin/fetchmail -f /home/randymon/fetchmailrc
        break
    done

    # Do we use curn for RSS?
    while true ; do
        test -x /usr/local/curn/bin/curn || break
        test -s /usr/local/curn/rwood.cfg || break
        /usr/local/curn/bin/curn /usr/local/curn/rwood.cfg
        chown randymon:users /home/randymon/output.html
        break
    done

    # Send out all outgoing mail using sendmail
    /usr/sbin/sendmail -q

    # Do we get news with fetchnews?
    while true ; do
        test -x /usr/sbin/fetchnews || break
        test -s /etc/leafnode/config || break
        test -e /var/lock/news/fetchnews.lck && break
        /usr/sbin/fetchnews
        break
    done

Ultimate Keystroke Efficiency

The final step is to set up bash aliases, or short bash scripts, to reduce the amount of typing and clicking you do. For example, SUSE Linux comes with a program called "kinternet" which connects and disconnects your PPP connection.
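Note how the fetchnews stanza in the script above skips its run when a lock file already exists. That guard pattern is worth copying around any job you add to poll.tcpip, so a quick redial can't start a second copy of something that's still running. A minimal sketch of the idea (the lock path here is a temporary stand-in for something like /var/lock/news/fetchnews.lck):

```shell
#!/bin/sh
# Sketch of the lock-file guard used by the fetchnews stanza above.
lock=$(mktemp -u)                  # a throwaway lock path for the demo

run_guarded() {
    while true ; do
        test -e "$lock" && break   # another run holds the lock; do nothing
        : > "$lock"                # take the lock
        echo "job ran"             # real work (fetchnews, curn, ...) goes here
        rm -f "$lock"              # release the lock
        break
    done
}

first=$(run_guarded)               # no lock present: the job runs
: > "$lock"                        # simulate a lock held by another run
second=$(run_guarded)              # lock present: the job is skipped
rm -f "$lock"
```

With the downloads guarded, the remaining piece to automate is kinternet itself.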
kinternet's command line interface is "cinternet," and once you've configured your ISP and dial-up number, you can connect from the console with a command like:

    cinternet -i ppp0 -p Connection1 -A

Put that line in your .bashrc file as an alias, and you can control your connection to the internet with a single word typed at the console. The following lines are what I added to my .bashrc:

    export PATH=$PATH:/home/randymon/bin/:/usr/local/curn/bin/
    alias connect="cinternet -i ppp0 -p Cotonou -A"
    alias curnupdate="curn /usr/local/curn/rwood.cfg"
    alias disconnect="cinternet -i ppp0 -O"
    alias pass="mv ~/output.html ~/`date +%d%m`output.html"
    alias readnews="links ~/`date +%d%m`output.html"

These are my connect and disconnect commands, a command to run curn directly, a command that renames output.html (generated by curn) to something like 1705output.html (where today's date is 17 May), and a command that opens the dated file for reading. I did the same thing in root's .bashrc, like this:

    alias down="shutdown -h now"

In sum, what I can do is power up my old box, log in, and type the following:

    connect
    pass
    readnews
    su
    down

... and be done with it. My system connects to the internet; downloads RSS feeds, mail, and Usenet news; renames output.html to a dated version; lets me read it using Links; and then, after I become root, shuts down with one word. This doesn't solve all my problems (no downloading video from YouTube over 56K, that's for sure), but it does make life over 56K pretty convenient.

Bootnote: A Word about Dial Up

Let it be said that the change in lifestyle from broadband to dial-up is healthy. No more surfing wildly around the web, since the clock is ticking (and you may be paying by the minute). And you compose your mail offline in a dedicated mail client and a good editor, rather than pounding out quick messages in a webmail interface. There's a fundamental modification of mindset when the Internet goes back to being something you connect to briefly to send and receive.
That's old fashioned, yes (amazing that "old fashioned" now refers to habits of less than a decade ago), but somehow it's healthy to disconnect from the neverending novelty nozzle.

Image courtesy of photographer James Brittin (www.jamesbrittin.com)