21 November 2013

The £125 upgrade

This is an illustrated account of the upgrade of my rather big laptop, with full commentary and an explanation of my choices and the reasons behind them.
The laptop in question is the really great MSI GT640, which has been faithfully serving me for the past three years. However, times change, and I had become increasingly aware that my usage patterns, bad as they are, were pushing the limits of this computer.

The target of the upgrade, an MSI GT640. Awesome machine!


The main weak point of the GT was its memory, or the shortage thereof. The computer came equipped with 4GB of RAM, of which only 3GB were usable due to the 32-bit version of Windows 7 that came pre-installed on it. Back when I first got it, that wasn't much of a problem, but my annoying habit of keeping Firefox running with a couple of hundred tabs open simultaneously quickly exhausted the supply of free bytes and forced Windows to lean heavily on the disk-based pagefile, leading to horrible delays (and consequent annoyance).
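For the curious, you can ask Windows directly how much of the installed RAM it actually sees. What follows is a minimal Python sketch using the Win32 GlobalMemoryStatusEx call through ctypes (Windows-only, standard library); the figures it prints are whatever your own machine reports, not the numbers from this laptop.

import ctypes

# Structure layout from the Win32 API documentation for GlobalMemoryStatusEx.
class MEMORYSTATUSEX(ctypes.Structure):
    _fields_ = [
        ("dwLength", ctypes.c_ulong),
        ("dwMemoryLoad", ctypes.c_ulong),
        ("ullTotalPhys", ctypes.c_ulonglong),
        ("ullAvailPhys", ctypes.c_ulonglong),
        ("ullTotalPageFile", ctypes.c_ulonglong),
        ("ullAvailPageFile", ctypes.c_ulonglong),
        ("ullTotalVirtual", ctypes.c_ulonglong),
        ("ullAvailVirtual", ctypes.c_ulonglong),
        ("ullAvailExtendedVirtual", ctypes.c_ulonglong),
    ]

status = MEMORYSTATUSEX()
status.dwLength = ctypes.sizeof(MEMORYSTATUSEX)  # the API requires this field to be set
ctypes.windll.kernel32.GlobalMemoryStatusEx(ctypes.byref(status))

print("RAM visible to Windows : %.1f GiB" % (status.ullTotalPhys / 2.0**30))
print("Currently available    : %.1f GiB" % (status.ullAvailPhys / 2.0**30))

On a 32-bit installation like the one described above, the first figure tops out at roughly 3GB, no matter how much RAM is physically fitted.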

The Elixir pair of 4GB RAM modules

Predictably, the first order of business was a RAM upgrade. The street price for laptop RAM modules isn't bad at all; however, I made a conscious choice to go second-hand, via eBay. These days, most gaming laptops come equipped with 8GB of RAM and a lot of gamers upgrade to 16GB almost immediately, so there are plenty of practically unused RAM modules for sale out there. One of these sets found its way into my hands for a reasonable £35.

However, simply adding RAM would only mean that the amount of memory the 32-bit operating system cannot use would increase from 1 to 5 gigabytes. The operating system would have to be changed to a 64-bit version so that all 8GB could be used.
The bad news is, you can't just upgrade an existing 32-bit installation of Windows to 64-bit; it has to be a clean installation. The good news is, you don't need to buy a new copy of Windows; the same licence covers both the 32- and 64-bit versions.
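If you want to double-check which flavour you're currently running before committing to a reinstall, Windows exposes this through a couple of environment variables. Here is a minimal Python sketch, standard library only; note it reports the bitness of the installed Windows, not whether the CPU itself is 64-bit capable.

import os
import platform

arch = os.environ.get("PROCESSOR_ARCHITECTURE", "")
# This variable only exists for 32-bit processes running on 64-bit Windows.
wow64 = os.environ.get("PROCESSOR_ARCHITEW6432", "")

os_is_64bit = arch.endswith("64") or wow64.endswith("64")
print("Windows version : %s" % platform.platform())
print("64-bit Windows  : %s" % os_is_64bit)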
Since I was going to reinstall Windows and format hard disks anyway, it seemed like the right moment for something I had been considering for quite some time: an SSD (a solid-state drive, using flash memory chips instead of rotating discs). They've been around long enough to have gone through their teething issues, so they're now very fast and, at the same time, cheap enough to replace the system drive of a computer without having to sell your first-born.
 
The new SSD: very fast and very stylish!

One model had caught my eye for being very fast: the new-ish Samsung 840 Pro, with the 128GB version being just perfect for my needs as a system drive. Obviously, I could have found something a bit cheaper without losing much in speed, but I knew that in the end I'd just regret not spending that little bit more. In for a penny, in for a pound. After a brief search, I managed to get a brand-new, sealed one for an amazingly low £85!
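As an aside, a crude way to see what you got for the money, short of a proper benchmark like CrystalDiskMark, is to time a large sequential read once the drive is in. Below is a rough Python sketch; the file path is hypothetical, and the file needs to be larger than your RAM (or freshly written), otherwise the operating system's cache will flatter the numbers.

import time

TEST_FILE = r"D:\temp\big_file.bin"   # hypothetical path to a multi-gigabyte file
CHUNK = 4 * 1024 * 1024               # read in 4 MiB chunks

total = 0
start = time.time()
with open(TEST_FILE, "rb") as f:
    while True:
        block = f.read(CHUNK)
        if not block:
            break
        total += len(block)
elapsed = time.time() - start

print("Read %.0f MiB in %.1f s (%.0f MiB/s)"
      % (total / 2.0**20, elapsed, total / 2.0**20 / elapsed))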


The logical course of action would be to remove the old mechanical (i.e. spinning discs) 500GB Seagate drive and replace it with the new SSD, but free space for movies and photos is hard to give up. While thinking it over and regretting that the MSI engineers hadn't saved some space for a second drive, I realised that they did save space for a DVD recorder, something I had used no more than three times in the three years I've had the computer. The DVD drive is SATA, so I started wondering whether it used the same connector as a hard drive. The bad news is, it doesn't: optical SATA drives have a slightly smaller connector. The good news is, there are adapters!

The no-name adapter comes with a screwdriver and an extra faceplate for itself or the optical drive it replaces


The adapter is a case with the exact same shape and dimensions as a laptop optical drive. The main difference is that this one has an empty bay for a 2.5-inch hard disk drive and the appropriate connectors: inside, a standard SATA connector; outside, a slimline SATA connector. Strategically positioned screws make sure the drive stays put, while the designers have made sure to include all the mounting holes, threads and notches typically found on optical drives, so that the adapter can replace (almost) any optical drive without issues.
Thankfully, the manufacturers have settled for more or less standard positions and gauges for brackets, clips, screws and sockets, so there is little that can go wrong there.
Most devices are interchangeable, with only a fixing bracket or two needing to be relocated.

Again, eBay is our friend, and I ordered one of those for a mere £5, including shipping! I had some reservations about the build quality (especially that of the electronics), but since the adapter's electronics are entirely passive (i.e. only "wires" and sockets, with no components other than, maybe, a couple of resistors or capacitors), I concluded that it would be pointless to spend more money on something that would in all probability have come from the same factory. After a couple of weeks' wait, China being far away and in the middle of a national holiday, everything was set for the big day of the upgrade!


1 November 2013

Microsoft, Apple and the Great River

I was recently having a brief discussion with a friend about Windows 8.1 and what Microsoft did. He, a developer of a well-known desktop add-on suite that has been around for more than a decade, felt that, overnight, Microsoft had essentially made all those years of work obsolete. The transition Microsoft tried to make with Windows, from a desktop operating system towards a much more "tablet" and touch-oriented interface, disrupted an entire ecosystem of Windows tools, add-ons and assorted applications, not to mention that it enraged the vast majority of users.
Apple came up naturally in our discussion: where Microsoft keeps stumbling in the dark, Apple had the leadership of Steve Jobs, a truly charismatic man with an incredible insight into technology and the ability to envision things nobody else could. Microsoft, on the other hand, had Bill Gates, an admittedly very smart guy but no genius, who guided his company through the times of MS-DOS, Windows 3.1 and Windows 95. Presently, Microsoft has to make do with the decisions of one Steve Ballmer, a guy with the charisma of a used-car salesman and the technological insight of a troglodyte. Think I'm exaggerating? Watch this:


Yes, this is the current CEO of Microsoft, a position he has held since 2000. His reign has seen, among other things, the introduction of Windows Vista, the company's biggest failure until then, and of Windows 8, another huge flop, with Windows 7 in between: a simply adequate product whose main saving grace was the fact that it came after Vista (and, of course, from Microsoft).

Obviously, mentioning Steve Jobs and the current fortunes of Apple got me thinking. For those of you who don't know what Apple was doing 20 or so years ago, here's a brief synopsis.
Steve Jobs was practically kicked out of Apple in 1985, only one year after the introduction of the original Macintosh, by ex-Pepsi boss John Sculley, a man whom Jobs himself had brought in. After Jobs' departure, Apple kept going on sheer momentum: the 'magic' was gone. They kept improving their Macintosh line, making newer and bigger machines, building on what was already there, but eventually ran out of steam.
In less than a decade, they were in big trouble, as the computer world kept evolving in leaps and bounds, while Apple was plodding along. The best description of Apple's situation came from Steve Jobs himself (reportedly quoting Gil Amelio, Apple's CEO):

For those who can't be arsed to hit 'play', this is how he described the situation just before Apple brought him back to save the company: "Apple is like a ship, with a hole in the bottom, leaking water. And my job is to get the ship... pointed in the right direction."
Apple had lost its vision and, more importantly, they had lost the one person who actually *had* a vision and could convince others to pursue it.
The man who saw the computer mouse and thought "icons" when everybody else thought "text menus"; who could see the whole picture where everybody else could only see as far as the next dot.
Apple had become big and complacent, and its leaders were businessmen, not geeks: unable and unwilling to take risks and bet everything on something revolutionary. They were all trying to play it safe, and all inevitably failed.

Keeping that in mind, where is Apple now? What is Apple, without its great visionary?
Quite seriously, I think Apple is now too big to fail. Having said that, though, Apple would need an equally big man at the helm. Unfortunately for Apple, and for the rest of us, there is no such man. Men like Steve Jobs come along once in a generation or two, and we've already had ours.
So, I guess Apple will keep going, as it did back in 1985, on sheer momentum, and it's up to the leadership to keep it going for as long as possible. But I personally don't expect to see any major breakthroughs from them any time soon.
The days of the iMac or the iPad are probably over.

But since I started with Microsoft, I can't finish with anyone else. Microsoft, much like the Apple of the '90s with its hole in the bottom, has a hole in the head. And a big one at that.
Don't get me wrong: Microsoft was always copying ideas from all over, and their strategy was to buy small companies that made the "second best" programs, beef them up, relabel them and sell the final product as their own. Microsoft was never an innovator, nor a risk-taker. They never paved the way; instead, they waited until the road had been paved and then ran to the front.
Every time Microsoft had to lead the industry, they failed. Miserably.
So now their leadership has decided to copy Apple. Unfortunately for them, their leaders not being the brightest sparks in the bunch, they copied none of the right things and all of the wrong ones. Twice.

What does the future hold for the big Apple and the Micro Soft? No, they can't fail, at least not immediately. But Apple can very easily end up a glorified cellphone maker, just as easily as Microsoft can end up a maker of operating systems and office suites for "business computers".

Oh yeah... and a games console maker. Almost forgot... they've got Xbox...


P.S. Damn, Steve, we're gonna miss you.

Dumb terminals and silicon evolution

This won't exactly be your 'normal' news commentary. You have been warned.

David Lightman and NORAD terminal
Some time ago, I had been tinkering with a Linux distribution - CentOS minimal, for those who care - trying to see whether it was feasible to set up a web server at home. Logically, the distribution I chose was a "barebones" version, which means no Windows-like environment, no fancy graphics, no frills. Just the basic 'console' interface and nothing more.
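Just to give an idea of how little "a web server" fundamentally is, before any real software like Apache or nginx enters the picture: the toy Python sketch below serves the files in the current directory over HTTP. It has nothing to do with the actual CentOS setup - it's purely illustrative, and the port number is an arbitrary choice.

from http.server import HTTPServer, SimpleHTTPRequestHandler

# Listen on every interface, port 8080, and serve the current directory.
server = HTTPServer(("0.0.0.0", 8080), SimpleHTTPRequestHandler)
print("Serving on http://localhost:8080/ - press Ctrl+C to stop")
server.serve_forever()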

Now, so far, this shouldn't sound too interesting, let alone something that would justify a blog post. Well, bear with me for a minute.
The basic concept is that a web server is a machine that works 24/7 and has no keyboard, no mouse and no monitor, since most of the time it'll just be... serving the internet. Communication with the system is usually done remotely, i.e. from another computer, through a "terminal": a programme emulating the function of the old computer terminals we've all seen from the '70s and early '80s. With that in mind, and being a lover of "antique" technology, I started thinking that I'd really love to have an actual terminal (the machine) on one side of my desk and administer the server from there, rather than from a terminal (the programme) running on one of my 'modern' Windows computers. Now, 'wanting' one and 'getting' one are two entirely different things; even more so since there are very, very few on sale, and even those would have to be shipped across the Atlantic from the United States, which in itself is prohibitive because of the weight and size -- not to mention the fragility of those things.
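Incidentally, what a terminal - the machine or the programme - actually does is embarrassingly simple: send keystrokes down a line, print whatever comes back. The toy Python sketch below shows the idea over a serial port; it assumes the third-party pyserial package, a Unix-like host and a null-modem cable on /dev/ttyS0 - all assumptions of mine for illustration, not details of the setup described here.

import select
import sys

import serial  # third-party package: pyserial

PORT = "/dev/ttyS0"   # assumed serial port wired to the other machine
BAUD = 9600           # a classic terminal speed, chosen arbitrarily

def run_terminal():
    with serial.Serial(PORT, BAUD, timeout=0) as link:
        print("Connected to %s at %d baud. Ctrl+C to quit." % (PORT, BAUD))
        while True:
            # Wait until the keyboard or the serial line has data for us.
            readable, _, _ = select.select([sys.stdin, link], [], [])
            if sys.stdin in readable:
                # Forward local input to the remote machine, line by line.
                line = sys.stdin.readline()
                link.write(line.encode("ascii", "replace"))
            if link in readable:
                # Print whatever the remote machine sends back.
                data = link.read(link.in_waiting or 1)
                sys.stdout.write(data.decode("ascii", "replace"))
                sys.stdout.flush()

if __name__ == "__main__":
    try:
        run_terminal()
    except KeyboardInterrupt:
        pass

A real terminal programme adds escape-sequence handling, raw keyboard mode and flow control on top of that loop, but the core job is just this shuttling of bytes.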

What's the alternative? Well, unsurprisingly, terminal emulators have been around since the very beginning of personal computers. In fact, the first personal computers began their careers as "smart" terminals: machines intelligent enough to be used on their own, if that level of "intelligence" was enough for you. So my second thought was to get one of my old PCs out of mothballs and use it as a terminal. Obviously, size was a deciding factor, so after checking out what motherboards and cases I had, I decided to settle on an old Toshiba notebook equipped with an Intel 486SX CPU, 12MB of RAM and a Super VGA adapter and monitor. It isn't the oldest nor the 'weakest' machine I have - there are other 486 boards, 386s, even a couple of 8086 boards in my stash - but the Tosh was the smallest, taking total desk footprint into account.

Toshiba 486 on mission: terminal (stock image)

And right about then... it hit me.
I was going to use a 486 PC as a 'dumb' terminal.
A 486 !!!


For the record, the first PC I ever laid hands on was an 8086 (or 8088, who knows) XT compatible. The second was a 286. The first PC that I bought (well, my father did, but you know how it is) was an -- at the time almighty -- AMD 386 at 40MHz. A mean machine for its time, but even so, a 486 was everybody's wet dream, and 486 machines were really expensive -- roughly an order of magnitude more so -- while notebooks were equally expensive, if not more. A 486-based notebook was something almost nobody could have back then, as they cost more than a new car.
Fast forward to today, when one of these machines is just about to spend its final days serving as... a terminal!
For the record, the 486 notebook has about 100 times more processing power than even a 'smart' terminal requires, and 10,000 times or more than a 'dumb' one needs.

So, finally, we come to the "conclusion". How far we have come, fair ladies and dear gentlemen, when a really powerful computer gets assigned terminal duties - and in today's world, we don't even notice. No, the 486 was not some miracle chip of the '80s with processing power rivalling today's Cores and Pentiums, nor can it run Windows 8 (or 7, for that matter). It cannot run a modern operating system, cannot run a browser like the one you're using to read this, and cannot even play an mp3 at the same time.
However, it could very easily run an older version of Windows with Word or Excel and serve as a fine office computer. CPUs like it still serve today in devices you wouldn't even think of. Trust me when I say a 486 is a powerful CPU - a lot more powerful than any of the CPUs in the 'home' computers we had as kids in the '80s (Spectrums, Commodores, Ataris or Amigas).
Still, today it's going to be a terminal.
Today, even a 'stupid' cellphone is based on a processor much more powerful than the lowly 486, much smaller and consuming a tiny fraction of the 486's power.
Today, we're surrounded by incredible processing power. Think of cellphones, tablets, netbooks, notebooks, PCs and smart TVs, but also of things you don't think about: your wireless router, your fancy new washing machine, or even your new car with its GPS, ABS, anti-spin and parking system. All this computing power, which a few years ago could only be dreamt of, today just goes unnoticed.

We are living in a science fiction world of only 20 or 30 years past, guys.
Think about that and look around you... you're surrounded by yesteryear's magic...

8 March 2012

AMD ditches SOI

AMD, a long-time proponent of Silicon-On-Insulator (SOI), has announced that "on 28nm, all our products will be bulk". The comment came from Thomas Seifert, AMD's CFO.





While AMD and IBM, amongst others, have been actively developing and - of course - using SOI for more than a decade, Intel has been adamant in its refusal to use SOI for mainstream chips - processors and chipsets. They did very recently demonstrate SOI chips for silicon photonics, but they still have no plans to adopt SOI for their mainstream products.

Yeah, but why?

The main lesson here is that underdogs must follow the trends of the market; it's not that SOI technology is inferior. AMD was using SOI back when they had their own fabs, hadn't yet bought ATi and weren't making any very high performance chips in bulk silicon.
Now, they not only have ATi on board, they're also producing integrated CPU-GPU hybrids and, on top of that, they've sold (well, spun off) their fabs and are moving to distance themselves from any one foundry.
They didn't dump SOI because of some inherent technological inferiority, but because they can't find fabs to make chips on SOI. Right now, AMD uses both GlobalFoundries - their former fab spin-off - and TSMC, currently the biggest independent foundry, which has no SOI facilities.
For obvious reasons, they can't afford to design and produce highly complex chips (i.e. CPUs or GPUs) in both bulk and SOI, simply because that means double the design time, double the debugging time and double the cost. They have to pick one process and stick with it.
The choice of bulk silicon over SOI was unavoidable in one more way. What is now called AMD's graphics department, i.e. the former ATi division, had always been "fabless" and, as such, had no expertise in designing on SOI. Since the now unified company has to settle on a single fabrication technology, the choice was between switching the graphics division to SOI or the CPU division to bulk silicon. And since right now the graphics division is the one making headlines - or should I say *still* making headlines - for AMD, losing precious time switching them to SOI would be a mistake.
Moreover, SOI may be profitable for high-end products, where there's a sizable profit margin, but it's not exactly a money-maker where budget designs are concerned. Right now, AMD's CPU division is 'trapped' in the market layers below the high end, and while their graphics division may be doing extremely well at the high end of the GPU market, the main bulk of its sales resides in the same mid-range-and-below layers. Put simply, the SOI benefit would become a financial burden.

Fabs, on the other hand, don't invest in SOI because smaller outfits don't design for SOI, so they can't find enough clients to fill up their capacity. Among the major foundries, only GloFo already has the technology and extensive experience, but their only SOI clients are AMD and IBM's outsourced production.

So, is SOI dead? Over the course of time, SOI has proven itself to be a very interesting technology. Its merits are indisputable and there is reason to believe that, in some form, we haven't seen the last of it. For AMD, abandoning SOI is a strategic move, not an admission of defeat. Maybe when silicon photonics becomes prevalent, or maybe if (when) bulk silicon technology hits an evolutionary "wall", foundries will turn to SOI to harvest the last possible shreds of performance and optimisation that the technology has to offer. For the time being, it'll be gradually limited to very specific niches, much like the older Silicon-On-Sapphire, germanium or gallium arsenide, and the very new graphene, molybdenite and black diamond substrates.