Archive for November, 2008

Fedora 10 Review – Desktop Emphasis

Introduction

Since the days of 10.1, I have been an openSUSE fan. I have been trying out new distros regularly, finding some to be very good, but always in the end rolling back to SUSE for one reason or another.

And then came Fedora 10: I wanted to see if it could finally win me over.

System

I am testing on the same system as before: an Acer Ferrari 5005WLMi laptop (AMD Turion 64 X2 2.0 GHz, 2.0 GB RAM, 160 GB hard drive, ATI Radeon X1600 Mobility).

This is a rather regular specification set, shared by many others around the world.

First Impressions – Live Run and Install

It was a simple matter to get the Fedora 10 GNOME live CD ISO from the Fedora site, and burn it off.

Booting it up live, I noticed a rather crude progress bar leading up to the desktop; it looked much like a DOS load bar of yore.

But then, that is perhaps the best lead-up to a surprise: the beautifully done GNOME desktop. After logging into the live session, what you get is artwork that is unparalleled among mainstream distros (my opinion, obviously). If only they had done something about those antiquated icons too…

Fedora 10 Screenie 1

Leaving the goodies till later, I went for the install. It was a live installer with ‘unique’ features: almost exactly like the ones we see everywhere else. The install took about 10 minutes, after which I rebooted.

Booting up

For this Fedora release, what had been hyped was the high-speed Plymouth graphical boot mechanism. In fact, the boot was actually slower than my Ubuntu 8.10 or openSUSE 11.0 installs. But while you wait, you cannot help but notice the brilliant, animated blue Solar theme.

Once loaded, it asks you to make a username, and finally logs you into the GNOME desktop.

Stability

In all my time using mainstream Linux distros, the stability of this Fedora 10 release nearly tops the list. I have been using it for over two days now, and I have failed to crash it even once. And this phase of testing included some fairly brutal stress tests: opening many things at once, randomly clicking around, running conflicting programs, playing movies and so on.

Server-like stability on a system used as a desktop? What more could I want?

The graphics surprise

On most distros, my graphics card is not configured for 3D effects out of the box. Fedora 10 not only picked the right resolution, but actually allowed me to turn on Compiz effects straight after installation. Not that Compiz effects mean much to me, but they mean a lot to other people, and besides, this is a good way of really testing out if a distro has your hardware right.

This was a fairly pleasant surprise.

Software collection

One of my objections to Fedora: they try to stay on the cutting edge, yet omit many apps from their primary disc images. AbiWord is the only office application included, requiring you to install the massive OpenOffice.org over the internet. This is one of the few qualms I have with Fedora 10.

Fedora includes GIMP and Rhythmbox. It also includes the Empathy messenger instead of the Pidgin client.

Fedora 10 Screenie 2

Package Management

Given that Fedora is so conservative in its software collection, one must install many apps and (in particular) codecs from the internet. Here, PackageKit does fairly well – though I miss the power of SUSE’s visual YaST package manager.

Final Words

I like Fedora 10 very much. I can find practically nothing to really complain about. It looks good, works well and keeps working. And most of it out of the box.

On the negative side, the conservative package selection really bugs me – this is definitely not a recommended install for those with dialup connections. Moreover, there is a lack of really visible innovations – under the hood innovations are good, but a desktop user needs to be able to notice them.

Essential for those already on Fedora; recommended for people wanting a really stable, mainstream desktop Linux.

Rating: 8/10

Google's SearchWiki – it's not all that bad

26 November 2008

Ever since the introduction of Google’s SearchWiki for signed-in Google users, there has been a lot of hue and cry about cluttering, spamming and what not.

What is worse, many people have been griping about the lack of an opt-out option.

The truth is that SearchWiki isn’t all that bad. You can simply turn off the visibility of the public comments (the option is available at the end of each search page). With that, everything goes back to normal – except for the small green boxes in front of each link. While this does not restore the full pristineness of the Google search results, it does leave a generally unimpeded page.

Actually, SearchWiki is useful sometimes… When looking for particular keywords, you can see which link is most useful of all. While there is occasional spam, many good comments make it easier to choose the best link in the very first page of results.

It is, as always, another of Google’s experiments. The only surprise here is that it happened to Google’s primary cash cow – search – whereas almost all of their other recent experiments have involved less important things.

Nevertheless, Google’s SearchWiki seems to be here for good – Google is not talking about withdrawing it. In the meantime, you can simply remove the public comments (and visually filter out the faded green boxes) or disable it completely.

Either way, I find SearchWiki to be not all that bad.

Categories: news, websites

How to speed up a computer as it gets slow over time

22 November 2008

Introduction

It’s true, computers do get slower over time, especially if they’re heavily used. There are two broad reasons for that: hardware and software.

But hardware is much less of a problem. Most computer hardware nowadays is solid state, meaning there are few moving parts to slow down from wear and tear. The hard disk is perhaps the only common exception to this rule, but even hard disks are reliable enough to last years without slowing. Dust accumulating inside the casing and settling on the motherboard and processor may also slow things down, but only very slightly – and it takes just one wipe with a soft cloth to fix.

However, on the software end, things are different (particularly if you are using Microsoft Windows: Linux is generally immune to many of the following stated effects).

Software Effects

As a computer gets used, a lot of software is installed. Much of it may default to starting up at boot time, occupying precious memory and CPU cycles. Other software clutters up the Windows Registry. Yet more software is ‘clunky’: it works just fine, but in a messy, resource-hogging way.

These effects accumulate over time. It doesn’t help that Windows fragments its own files, hogs space with temporary files, and scatters clutter everywhere. Nor is Windows much of a housekeeper: it has no great built-in tools to keep these things in check.

Something to remember…

While you certainly cannot get your ‘brand new’ performance back if you intend to keep your computer in a usable, production state, you can maximize performance as far as possible. The reason is that many of your useful programs genuinely do need to reside in system memory all the time, make registry entries and take up disk space. That is not avoidable if you wish to do anything on your PC.

There are many commercial software applications out there that claim to return your system to the performance it gave when it was new. Take that with a boxful of salt: it cannot be done.

Moreover, there are plenty of ‘suites’ that claim to optimize performance in one click. Be wary of these too: you can easily end up with a malfunctioning system on your hands by trusting these often-aggressive utilities. Use a specialist tool for each problem; many good ones are available for free.

The ‘How to’

  1. You need to make sure your system is malware free. That means no viruses, no worms, no spyware or adware. If you had no defences already, get yourself a copy of some antivirus software and run a full scan. Also get copies of anti-spyware and anti-adware programs, and run their scans too. Many of the best companies offer 30-day trials you can use for this purpose. My favourites are Eset NOD32, Webroot Spy Sweeper and Lavasoft Ad-Aware. You can uninstall these protection programs after the scans if you only needed the trials.
  2. The next step is to clean out all the junk from your system. These are files that were once created by Windows or some program, but were left there to rot. There are plenty of tools available for this task; my favourite freeware is CCleaner.
  3. Windows uses the Registry for many functions, and it gets cluttered over time. Use a specialized registry cleaner to clear out invalid and leftover entries. If your chosen tool lets you specify the aggressiveness of the cleaning, always select the most conservative setting to avoid problems. I use EasyCleaner.
  4. As you install software over time, some of it starts having modules open at Windows startup. Some of these may be useful to you; others not. To begin, look at the taskbar notification area and make sure there is nothing there you do not want or use. If there is, open that application and see if there is a setting to remove it from the Windows startup queue. Then check the “Startup” folder in the Start Menu: anything in it loads automatically at boot time.
  5. Some software loads at boot but does not show up anywhere, quietly running in the background. On XP, go to Start Menu > Run… and enter ‘msconfig’. On Vista, just enter ‘msconfig’ in the Start search bar. Carefully go to the ‘Startup’ tab and untick the entries you do not want. If you are feeling brave, you may also want to check out the ‘Services’ tab. For either of those, a word of caution: be careful, and be sure before you remove something!
  6. Download the free JkDefrag and run it. This is one of the most effective disk defragmentation programs ever, free or not. And it runs from a simple executable, which makes life so much easier.
  7. Reboot your computer at least twice in a row.
  8. Invest in a good security suite if you did not already have one. My recommendation is Eset Smart Security, due to the unparalleled antimalware protection it offers. Another good choice may be Norton Internet Security. These suites may actually reduce performance a bit, but that is nothing compared to the speed you will lose if malware gets onto your system.
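
The junk-file cleanup in step 2 can be sketched in a few lines of Python – a rough, hypothetical stand-in for a dedicated tool like CCleaner, assuming all you want is to delete files in a folder older than a given age:

```python
import os
import time

def clean_old_files(folder, max_age_days=30, dry_run=True):
    """Delete (or, with dry_run=True, just list) plain files in `folder`
    whose last modification time is older than max_age_days."""
    cutoff = time.time() - max_age_days * 86400
    removed = []
    for name in os.listdir(folder):
        path = os.path.join(folder, name)
        # Only touch regular files modified before the cutoff
        if os.path.isfile(path) and os.path.getmtime(path) < cutoff:
            removed.append(path)
            if not dry_run:
                os.remove(path)
    return removed
```

Run it with dry_run=True first and inspect the list before deleting anything for real – the same caution that applies to any cleanup utility applies here.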

This step-through is by no means exhaustive, but if you have not maintained your PC in a long time, this will certainly give a very noticeable boost in performance.

Any other suggestions or comments from my readers are welcome too.

Google's spread into competitors' domain

It seems like every day Google is bringing out new products and ideas. And more often than not, we hear that they have to do with something the other tech giants of the world are already kings of – Microsoft, Apple, Intel, etc.

Google is spreading beyond search, that is obvious. The question of ‘why’ has the very simple answer: diversification. All companies of repute try to diversify the range of items, products and ideas they vend, so that in case any one sector is hit by some calamity, the overall picture remains bright.

And Google has plenty of cash to do this. Their search advertising still generates the lion’s share of their income (and is still their only major focus), and they are looking for avenues to supplement it, or to spread into areas unrelated to search. The problem comes when Google steps on other toes while doing so.

Almost all of Google’s online products hit Microsoft’s MSN. Android is a possible challenge to the Apple iPhone in the coming years.

But Microsoft’s online operations remain hardest hit. Google Search, Google Docs, cloud computing, browsers and nearly everything else they offer is a (better) rival to something Microsoft offers.

What interests me most is: where is this going to go?

An expanded Google Docs may conceivably challenge Microsoft’s major Office Suite. And Google’s foray into cloud computing in recent times could infringe on the kingdom presently reserved for Microsoft’s flagship Windows Operating System.

How will Microsoft, without the genius of Bill Gates at the helm, compete in the near future? Or will they be trampled by the ever-innovating Google? I expect the outcome of this battle to be clear in ten years’ time. Let’s see.

Anonymity services, what they are, and why you may want them

17 November 2008

My readers may have noted a recent drying up of posts – well, that is mainly because I have been very busy. I am still a university student, and that takes up a lot of my time.

Anyhow, I am back. I bring to the board a review of some anonymity services – services that allow you to browse the web, and maybe even download, anonymously.

Do note that this is written for beginners, so I have not attempted to remain 100% accurate in terms of terminology.

Finally, note that this post is link-free. I do not provide direct links to any of the services because, in my experience, these links change often. Please Google for them by name.

Why you may want them:

  • Mainly, for privacy
  • For browsing legitimate sites that may be blocked by your internet provider / country
  • For testing
  • For secure transfer of data over the internet

What they are:

These services use well-known third-party servers to anonymise you. Your data is sent over the internet to these special servers, which then forward your request to the website you wanted to visit.

To the website, and to everyone else in the loop, it seems as if the request is coming from that specific server, not from you. These anonymity servers promise to keep your data transmission private, and whether you trust them enough is entirely up to you.

The kinds:

I classify them into two kinds:

  1. Web-based services that allow anonymous browsing from within a browser
  2. Client–server services that you install on your system, which anonymise all outgoing internet communication

How the web based kind works:

  1. You open a website. These are often called CGI proxy or PHP proxy sites; try Googling the two terms.
  2. You enter the address of the website you want to browse anonymously.
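
Under the hood, all the proxy site does is rewrite the address you typed into a request it fetches on your behalf. A tiny Python sketch of the idea – note that the base URL and the `u` parameter name here are entirely hypothetical; real proxy sites each use their own:

```python
from urllib.parse import urlencode

def proxied_url(proxy_base, target_url):
    """Build the URL a web proxy site would fetch on your behalf.
    The parameter name "u" is an assumption for illustration; real
    CGI/PHP proxies each define their own query format."""
    return proxy_base + "?" + urlencode({"u": target_url})

# The target address ends up percent-encoded inside the proxy's own URL,
# so the destination site only ever sees the proxy server's address.
example = proxied_url("https://proxy.example/browse.php",
                      "http://example.com/page?a=1")
```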

How the Server Client kind works:

  1. You install the software on your system.
  2. You configure it.
  3. You configure your browser or other internet-communicating software to use it, often by pointing that software at a ‘localhost’ proxy.
  4. You can now use the browser etc. as normal, except that all of its communication is anonymised.
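
Step 3 – the ‘localhost’ proxy setting – looks like this when done programmatically with Python’s standard library. The port number 4001 is only an example (it is a common default for JAP/JonDo); check your own client’s settings:

```python
import urllib.request

# Route all requests made through this opener via a local anonymiser
# client listening on 127.0.0.1. Port 4001 is an assumed example value.
proxy = urllib.request.ProxyHandler({
    "http":  "http://127.0.0.1:4001",
    "https": "http://127.0.0.1:4001",
})
opener = urllib.request.build_opener(proxy)
# opener.open("http://example.com/")  # would now travel via the local proxy
```

A browser exposes the same knob in its connection settings dialog; the principle is identical.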

I will not provide a list of the web-based proxies here, because a single Google search turns up plenty of them.

Other services include:

  • JAP (JonDo)
  • I2P
  • Your Freedom
  • Tor

Most of these are free, at least at slow speeds. Once again, Googling these names will bring up the direct links.

I have found JAP to be the best of these services in terms of speed and convenience. For the most flexibility, use Your Freedom. For the greatest privacy, use Tor.

Latest AV Comparatives (On-Demand) results – is NOD32 still the best?

The latest set of results from the well-known AV-Comparatives covers on-demand scanning. These results show NOD32 earning less than the top rating: the best possible rating is “Advanced+”, and NOD32 v3 got “Advanced” instead.

This result was released a good month or so ago. The reason I did not comment on it earlier is that I was repeating a few tests of my own, very similar to the unorthodox tests I did earlier. I wanted to see if NOD32 had indeed fallen in capability. And though on-demand scanning has never been NOD32’s main strength, it was still surprising that malware was getting through its net, whereas my results had shown otherwise.

My methods remained the same: install each product on a node of my university’s malware-ridden LAN and see which one brought the best results. Once again, my friends’ PCs were used.

In short, I got the same results as before. For me, NOD32 still performed the best, followed at a short distance by Kaspersky. The versions tested were the latest as of 1/11/08. The other products I tested were Avira (the latest to top AV-Comparatives’ charts, and one I did not test last time around) and Norton. Avira lived up to its billing as a powerful and light antivirus, but gave too many false positives, once flagging a harmless Microsoft Word file as a trojan. Norton gave a much improved experience (surprising, since there was no major version update), letting only one virus through.

Note that my testing relies mainly on proactive defence. This automatically biases it in favour of NOD32 and Kaspersky, whose heuristics are well acknowledged. After all, if nothing gets through, why should one need to scan?

One point to note: NOD32 gives fewer false positives than most of the others.

Endnote

AV-Comparatives’ results show NOD32 performing worse than expected in this latest set of comparatives, while Norton is clearly a high performer.

But for me, under real-world conditions, NOD32 still tops the range, with Kaspersky in second position.

Technology: a stronghold of Consumerism

For those who are not sure about consumerism, I quote the Wikipedia definition: “Consumerism is the equation of personal happiness with the purchase of material possessions and consumption.”

Recently, while looking at the stats for my blog, I was struck by a very unexpected trend. Those of my articles that focus on companies, or at least seem to contain buying tips (like my last article), registered spikes in hits from web searches, feed viewers and syndication services like Blogburst. The only exceptions to this rule were perhaps the Linux reviews I posted, and even those can be discounted, since they were featured on the honourable DistroWatch.com.

This is disturbing: it seems that most tech searches lie in the consumerist domain. People are interested not in the tech itself, but in what they can buy of it.

I would love it if my readers commented on this.

Categories: pleasantries

AMD and Intel (and Nvidia) – How Things Stand

Much has been written about how Intel has cleanly trumped AMD in recent times with its dual- and quad-core processors (Santa Rosa, Penryn, Nehalem, etc.).

Well, to begin with, do not take this article as a review. It is just an opinion, a presentation of sorts. It will not help you buy a processor – this is not a buying guide. It is simply for reading pleasure.

Intel’s processors are good. Almost all current reviews rate Intel processors above AMD’s in general benchmarks. Even in terms of performance per unit of money, many now call the balance in Intel’s favour, particularly in the dual-core domain.

However, things are not as simple as they seem. Read my full case against benchmarking here, or read on for a short recap:

… benchmarking does not make sense for regular desktop use. I am not interested in the speed score of the computer I am going to buy (or already have, but want to compare); I want to know if it gets the things I want done faster than the competition. And in that kind of test, I have often found benchmarks to be awful liars: my laptop does the exact same sequence of Photoshop processes, video conversions, Matlab compilations and web serving faster than another computer with 3DMark and PCMark scores considerably higher than my own…

…benchmarks may often be optimized for one specific kind of computer hardware. For example, Intel’s Core 2 Duo processors are nowadays said to be triumphing over the AMD Athlon 64 X2 – based on benchmarks. But what if the benchmark algorithms are more optimized for Intel processors? They would then obviously run faster on the Intel! More complicated still, I know of a few benchmarks that run faster on an AMD Turion X2 than on a Core 2 Duo Mobile of the same specifications, while other tests run faster on the Intel. Is it fair to say the Intel is faster simply because it tops more benchmarks? After all, it all depends upon the hardware environment each processor is set in…

…One benchmark may test the processor for heavy number crunching, whereas that computer may normally be used for, say, multimedia. Another benchmark may test multitasking, when the computer’s main use is Adobe Photoshop. That is where the problem lies: benchmarks are often portrayed – and accepted – as absolute rankings, when they are almost always relative to task-specific requirements…
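
The alternative the excerpt argues for – timing your own workload rather than trusting synthetic scores – is easy to do yourself. A minimal sketch using Python’s standard timeit module; the sample task is a placeholder for whatever you actually do (a Photoshop batch, a video conversion, a compile):

```python
import timeit

def time_workload(func, repeat=5):
    """Time a real task several times and report the best run, which
    is the least noisy estimate on a busy desktop machine."""
    runs = timeit.repeat(func, repeat=repeat, number=1)
    return min(runs)

# Hypothetical stand-in for a real workload; substitute your own task.
def sample_task():
    sum(i * i for i in range(100_000))

best = time_workload(sample_task)
```

Run the same script on two machines and compare `best` directly: that single number answers the question the article cares about (is *my* work faster on this box?) better than any generic benchmark score.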

This, then, is where the problem comes in: how do you test a processor for power? The truth is, that may be impossible without knowing the exact requirements and the final, optimized hardware build around that processor.

And that is where AMD is still holding fort: there are benchmarks that AMD still tops, particularly in the domain of super fast linear computing.

People have even begun to claim that AMD is going down the drain while Intel has become dominant. I am inclined to agree. AMD, despite a history of comebacks, is simply cash-strapped. They have recently been forced to split the company in two: one section for design and one for fabrication. That is a testament to how powerful Intel actually is. And while we may not want an Intel without competition (that would send prices skywards), it does prove that Intel is the strongest player in the game.

However, AMD never ceases to surprise. Their most recent shot was not in the processor industry but in graphics: after years of ceding the graphics card market to Nvidia, they suddenly bounced back with the ATI (a.k.a. AMD) Radeon HD 4870 X2 graphics card, and blasted apart the integrated-graphics chipset market with the 780G.

And Intel may soon be in the line of fire, courtesy of the AMD Fusion processor with its integrated GPU. To the best of my knowledge, Intel has no real plans to counter such an innovation.

Last Words

So, as things stand, AMD is bruised and Intel is flying. But this is a phase AMD may come out of in the near future, given the company’s restructuring. Moreover, AMD’s innovation (and Intel’s lack of imagination) may once again put AMD in a position to attack. For users, that would mean another glorious era of high speeds and low prices.

More Ad Experiments

The most observant of my regular readers will note that today, my ad setup has changed once again.

This post is just a notification. Any complaints about the new ads are welcome.

I have shifted to Direct Rightmedia and Adify. So far, Direct Rightmedia has grossed the highest, so it stays. Bidvertiser went out, as did Chitika.

I may update this post in a day or two.

Faster than light – Impossible, even if Sci-Fi says so

Once upon a time, I was watching some movies and realized how many sci-fi films rely on the concept of faster-than-light travel. This was many years ago (I was barely into high school). I wrote, at that time, a small article about why faster-than-light travel is impossible. While there are many better, more interesting (and more accurate) explanations I could now provide (some are presented on my other site, Super Physics), I feel the old simple explanation still stands. So I thought I would post it, for lack of anything better to do this Saturday…

“There was once a lady named Conway,
Who traveled faster than light,
She departed one day, in a relative way,
And returned on the previous night.”

Those of us who watch Star Trek, or are interested in sci-fi, are already familiar with the concept of starships traveling faster than the speed of light – approximately 300,000,000 m/s, the fastest speed currently known to mankind. Although the concept is amazing, one must wonder what energy source would power such a ship. To answer that question, certain facts must be considered:

  • All the energy in the universe is not sufficient to propel even an electron to the speed of light.
  • Objects get heavier as they approach the speed of light; eventually they would become infinitely massive.

The solution offered by many theoretical physicists is simple: just beyond the light barrier lies a field of super-cooled space-time energy called the ‘Speed Force’. This energy is unlimited, and if harnessed properly it would be enough to propel matter beyond the speed of light. As far as the mass problem is concerned, new theories suggest that when objects travel at ½ or ¾ of the speed of light, they enter a space-time distortion field and, to some extent, no longer exist in our universe. Properties such as friction do not exist, and the laws and equations of motion no longer apply. Otherwise, an electron accelerated to 0.999999999999999999 times the speed of light would hit you with the impact of a truck! Another interesting concept is the slowing down of clocks: at half the speed of light, clocks slow down by about 13%, and at the speed of light they would stop completely. Even on the space shuttles we have today, clocks tick less than one ten-millionth of a percent slower than their counterparts on Earth. Sadly, the solution is to build clocks in an environment that is already traveling at the needed speed.
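
The clock-slowing figures follow from special relativity’s time-dilation formula: a moving clock ticks at √(1 − v²/c²) times the normal rate. A quick check in Python shows that at half the speed of light a clock runs roughly 13.4% slow, and at three quarters of it roughly 33.9% slow:

```python
import math

def clock_rate(v_fraction_of_c):
    """Fraction of its normal rate at which a moving clock ticks
    (the inverse Lorentz factor), for speed v given as a fraction
    of the speed of light c."""
    return math.sqrt(1.0 - v_fraction_of_c ** 2)

half_c = clock_rate(0.5)            # ≈ 0.866, i.e. ~13.4% slow
three_quarters_c = clock_rate(0.75)  # ≈ 0.661, i.e. ~33.9% slow
```

As v approaches 1 (the speed of light) the rate approaches zero – the clock stops – which is the simple reason no amount of energy gets a massive object all the way there.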

Another mind-boggling concept is that pocket universes could be created if something collided with the centre of the universe at the speed of light: the collision would create bubbles of super-cooled space-time that evolve into universes. Some say that at the point of formation, God can be found. Religious scholars regard this as heresy, and physicists regard it as a concept impossible to prove with experiments or equations.

Categories: pleasantries, science