This site, formerly known as ‘zebpedersen.co.uk’, has undertaken its third official server move. Since the first post on Sept. 15th, 2010, the site has inhabited Uncle Meat – an HP DL380 G3 which was loud as hell, Lumpy Gravy – a homebuilt PC which was much more civilised, and most recently Phaze III – a rented VPS on a 200 meg pipe (up from the 512k home broadband the original machines ran on).
That server grew slow as hell; I’m serving more professional content now, which means uptime is important, and the old box wasn’t cutting it – and it was expensive. And thus we arrive at today: the old site, reborn on a new, faster, more reliable, cheaper server (named Thing-Fish), and with a new domain to boot. The original zebpedersen.co.uk will soon redirect to the new zebpedersen.com, my portfolio site, so update bookmarks etc… to the somewhat new zeblog.co.
One of my favourite things about Linux is its ability to transform a formerly useless elderly computer into something really useful indeed. This is the first in a series of posts covering some great projects to breathe new life into old machines using Linux.
What’s the Project?
BitTorrent is a fantastically useful distribution mechanism – when downloading updated Linux distro ISOs, it regularly exceeds the speeds available through centralised (HTTP/FTP) download repositories, and the sharing of bandwidth and ease of distribution are a great fit for online communities and fan groups (Zappateers, for example). Unfortunately, BitTorrent tends to be a total disk I/O hog, constantly reading and writing tiny chunks of files all over your disk. This causes big slowdowns for every other running program that needs access to your hard disk, not to mention the stress it puts on the drive itself.
We’re going to use an old computer, GNU/Linux and the Transmission BitTorrent client to offload this resource-intensive process to a networked machine, freeing up your main machine to get on with business unfettered. Sacrificing convenience, however, is not an option. Therefore, we’re going to implement the following convenience features:
- Administering the client through a web browser
- Sharing content over the local network
- Creating a zero-setup process for starting new transfers
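As a taste of where we’re headed, Transmission’s headless daemon keeps its configuration in a settings.json file, and a fragment like the following covers the first and third features on that list. The paths and addresses here are examples only – adjust them for your own network:

```json
{
    "rpc-enabled": true,
    "rpc-port": 9091,
    "rpc-whitelist": "127.0.0.1,192.168.1.*",
    "download-dir": "/home/zeb/downloads",
    "watch-dir": "/home/zeb/torrents/watch",
    "watch-dir-enabled": true
}
```

With rpc-enabled switched on, any machine on the whitelist can drive the client from a browser at port 9091, and any .torrent file dropped into the watch directory starts transferring automatically – that’s our zero-setup process.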
I was contacted a couple of days back by a reader who wanted to add my super simple WordPress backup program to Cron, the Unix task scheduler, to create automated regular snapshots of his site. Backing up my website with the program is something I normally do by hand as part of my general system maintenance routine, but it’s easily incorporated into Cron to run on a timer. Completely by coincidence, I suffered a WordPress plugin failure within a couple of days of creating the Cron job, and thanks to my now-daily backups I was able to restore from a 10-hour-old snapshot with a minimum of fuss.
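The Cron side of it is a one-liner. Assuming the backup program lives at /home/zeb/bin/wp-backup.sh (the path is an example – use wherever you keep yours), a crontab entry for a daily 3 a.m. snapshot looks like this:

```
# m  h  dom mon dow  command
  0  3   *   *   *   /home/zeb/bin/wp-backup.sh
```

Run `crontab -e` to add the line to your user’s crontab, and `crontab -l` to confirm it took.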
For me, the TWiT network is one of the most valuable resources on the internet: its programming is always informative, slick and balanced, and the guests are always top-tier tech talkers.
As with most pod/netcasts broadcast from the States, the live shows are always scheduled at pretty peculiar hours of the day here thanks to the time difference, which means waiting for the edited version to be uploaded and published to the TWiT website.
Handily, the editors seem to work much quicker than the web folks over in Petaluma, CA, and the shows are normally uploaded quite a bit earlier than they are listed on the website. I’ve been using a little Python program which I wrote to download the shows before they are linked on the site, which you can grab using the instructions below. It uses cURL to download the files so it’ll work on Macintosh and Linux systems.
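The trick amounts to guessing the file’s address and polling until it appears. Here’s a minimal shell sketch of the idea – note that the host and file-naming pattern below are invented for illustration, not TWiT’s real media layout:

```shell
#!/bin/sh
# Build a candidate episode URL from a show slug and episode number.
# The host and naming scheme here are placeholders, not TWiT's real layout.
episode_url() {
    printf 'https://media.example.com/%s/%s%04d.mp3' "$1" "$1" "$2"
}

url=$(episode_url twit 300)

# Probe with a HEAD request; -f makes curl fail on HTTP errors, so the
# download only starts once the editors have actually uploaded the file.
if curl -sfI "$url" >/dev/null 2>&1; then
    curl -sO "$url"
else
    echo "Not up yet: $url"
fi
```

Wrap the probe in a loop with a `sleep`, or fire it from Cron, and the show is waiting for you before the website ever links it.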
A couple of days ago I posted a small program that I created to facilitate easy creation and management of RAM disks on a Linux system. At the time, I missed a prime opportunity to do some benchmarking:
Using a RAM disk, you gain a working folder with speed way far in excess of any hard disk or even solid state drive
Just how ‘way far in excess’ of a traditional hard drive is a RAM disk’s speed? Way far indeed, and here comes the proof:
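If you want to reproduce the numbers yourself, the quickest route is dd. Most Linux systems mount a RAM-backed tmpfs at /dev/shm, so you can compare it against a disk-backed directory without setting anything up; conv=fdatasync keeps the hard-disk figure honest by forcing the data out of the kernel’s write cache:

```shell
#!/bin/sh
# Write 64 MB of zeroes to a RAM-backed path, then to the current
# (disk-backed) directory, and compare the transfer rates dd reports.
dd if=/dev/zero of=/dev/shm/bench.tmp bs=1M count=64 2>&1 | tail -n 1
dd if=/dev/zero of=./bench.tmp bs=1M count=64 conv=fdatasync 2>&1 | tail -n 1

# Clean up the test files.
rm -f /dev/shm/bench.tmp ./bench.tmp
```

The last line of each dd run prints the throughput; expect the tmpfs figure to dwarf the disk one.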
Ubuntu Linux made its way onto my 13″ MacBook Pro this week via external storage. While installing, I was armed with just the laptop itself and a wireless-only internet connection, a real problem given that Ubuntu doesn’t detect the WLAN card out of the box.
I put together a disc to get the machine up and running which includes the DKMS package and the WLAN & nVidia display drivers. It’s not a full collection of files needed to get the laptop playing perfectly with Ubuntu, but it takes care of the most important core functionality.
You can get hold of the ISO by following this link, and I’d also highly recommend reading up on this page, although I’ve outlined and expanded on what I consider to be the most useful pieces of information from it below. I used Ubuntu 11.04 on a 13″ MacBook Pro 7,1 for everything in this post.
I, like millions of others, use Mac OS X as my main operating system. It’s fast, reliable and secure, and the computers it runs on are undeniably the best designed and built machines available on the market. There are many options available to users who need the added flexibility of running Linux or Windows alongside OS X, perhaps through SSH or by using a Virtual Machine. Sometimes, though, you need a full, non-virtualised OS environment to work in, and while Boot Camp is great it’s not ideal for someone like me who rolls with a very fast, but very small, SSD boot drive.
In this tutorial, I’ll show you how to install Linux to any external USB device and boot your Apple computer from it. I’ll be working with Ubuntu 11.04 32-bit and a MacBook Pro running Mac OS X 10.6 Snow Leopard. Guides elsewhere online seem to only help you if you happen to already have a Linux box to work with – my tutorial only requires one Macintosh computer. All the software used herein is open source and free of charge.
Last week, I published a seven part series of posts introducing some of the key concepts, commands and techniques of the Linux/Unix command line. Here’s an index of the topics covered to help you find what you’re after.
It seems that I’m not the only one who finds the tinsel of an operating system GUI distracting when trying to concentrate on some work. While many Linux distributions allow you to fullscreen a Terminal window, Mac OS X’s Terminal.app provides no such option. Although I did (briefly) consider rebooting into single-user mode (which starts OS X with a BSD shell), it seemed like a better idea to find out whether anybody had already had the same thought as I, which led me to a great solution over at http://www.hccp.org.
To enact this process yourself, you’ll need X11 and the Mac OS X Developer Tools installed.
Knowing basic *nix terminal commands is an absolute must for any computer pro. Whether you use Windows, Mac OS or Linux, you’re bound to face the command prompt at some stage, so here’s my crash course in CLI. The final sections of the guide will introduce a couple of slightly more advanced concepts which will help you get the most out of your command line experience.
Remote Access 101
When you open a Terminal window on your Linux or Mac computer, you’re interacting with the operating system through an interface known as a ‘shell’. The Terminal you just opened interfaces with your local computer, but using a protocol called ‘SSH’ you can control a remote computer using the same Terminal interface. (SSH stands for ‘Secure Shell’, owing to its increased level of security over its predecessors.)
SSH allows you to open a Terminal session (a shell) on your local machine which interacts with a remote computer. This is most commonly used for server administration, or with any computer that runs ‘headless’ (without a monitor).
Having a few SSH tricks in your toolbox is very useful indeed, and later on I’ll show you how to bypass network restrictions and use remote GUI applications as well, but first we need to get connected.
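The basic invocation is a single command. The user and host names below are placeholders – substitute an account and machine of your own:

```shell
# Open a shell on the remote machine.
ssh zeb@media-server.local

# Run a one-off command remotely without keeping a session open.
ssh zeb@media-server.local df -h

# Connect on a non-standard port (the default is 22).
ssh -p 2222 zeb@media-server.local
```

The first time you connect you’ll be asked to confirm the remote machine’s host key fingerprint; after that, it’s just the account password (or key passphrase) and you’re in.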