WHAT HAVE I LEARNT?

I had no idea what Digital Cultures would be about when I first started my course back in September 2009. The first few lessons focussed on what blogging is about and how the internet phenomenon started. Two names are most memorable from the beginning of the year - Heather Armstrong (of Dooce fame) and Sir Tim Berners-Lee.

We later looked at technological convergence and whether technology has changed news delivery. The answer was a resounding yes, and Apple's iPhone helped me come to a sensible conclusion. We then looked at whether cultural imagination forms the future, with the console generations helping me to understand these elements well. In later lessons we looked at monopolies, when I researched Google and found out some very interesting information. I also looked into the worst and best user interfaces, although perhaps I was a little biased towards Apple?

Computing topics dominated later weeks, when we discussed whether computers will surpass the human mind, whether virtual worlds are immersive, whether Web 2.0 is addictive, and the possibility of the Sony Reader killing traditional books. These were all interesting areas, mainly because they are topics that are progressing and growing in popularity rapidly.

I got more personal in later postings with my rant about how terrible Microsoft's Internet Explorer is, and whether Rockstar are the pioneers of virtual worlds with their Grand Theft Auto series. Further lessons discussed more serious issues about whether personal privacy is rapidly declining, if intellectual property is respected, and if the UK Government will eventually replicate a Big Brother society.

Later topics related to the free aspects of the internet, when I blogged about whether open source software should be pre-installed, if a semantic web is a good idea, and the issues surrounding net neutrality.

Nearing the end of the lessons we talked about more economic aspects such as The Long Tail and the Digital Divide. These were definitely business related topics, with digital elements to them.

The final task of the Digital Cultures course was to discuss four topics in a completely independent manner. The first was a topic of interest that was not covered in detail during the taught lessons, for which I decided to blog about the developing world of eGovernment; the second was a topic where my original thoughts had developed or changed, with my views on open source meeting that criterion; the third was a discussion regarding the future of Digital Cultures; and the fourth was a reflection on what I have learnt throughout the entire year (this specific blog post).

"The Digital Revolution: The World at the Click of a Button"
Fernanda Romano (International Trade Forum - Issue 3 - Published 2009)

So, to conclude, I have learnt a great deal about Digital Cultures worldwide. From the internet to everyday technology, the Digital Revolution is having a huge impact on our daily lives.

Posted on 4/17/2010 by JUDICIOUS JOE

THE FUTURE OF DIGITAL CULTURES

It's easy to analyse past events in technology, but rather difficult to predict future ones. In the last ten years alone, technological achievements have been incredible.

We've seen games consoles launch a whole new market, sometimes generating revenues comparable to popular international films. We've seen computers go from being rare in homes to becoming as common and widespread as televisions, now regarded by many as an essential fixture in the home. However, I am not writing this post to discuss the past, only the future. So what does the future really hold for Digital Cultures?

Well, I believe personal surveillance by governments is going to be a big issue in the future; the amount of information they could potentially hold about us is a frightening prospect. I also believe the keyboard and mouse may be consigned to the bin, with touch screens being widely adopted by manufacturers. It wouldn't surprise me if laptops, desktops and mobile phones all use some form of touch screen element in ten years' time.

Device storage and processors could also progress to unprecedented specifications. Traditional hard drives will be phased out and replaced by solid state alternatives. Dual core and quad core processors will be regarded as slow, with six and eight core implementations becoming the industry standard. Whether Microsoft will retain such a high market share will also be interesting to watch. I would like to see Apple's Macintosh brand accelerate in growth, and not just be regarded as good for video, music and design and nothing else - after all, people can't use the 'no right click' argument anymore!

"It's a peculiar feature of this technology that by making things smaller everything gets better. The transistors get faster, you can put more of a system on a chip."
Dr Gordon Moore (co-founder of Intel and originator of Moore's Law)

So, judging by Moore's Law, the future of Digital Cultures looks positive. However, if people - governments in particular - abuse the power of advancements in technology, then the consequences could be a very scary prospect indeed.

Posted on 4/15/2010 by JUDICIOUS JOE

FURTHER VIEWS ON OPEN SOURCE SOFTWARE

I wrote a post back in February about whether open source software should come pre-installed on both Microsoft's and Apple's operating systems. As I have used open source software more and more over recent weeks, my views have changed slightly.

I use an Apple MacBook Pro with Boot Camp running Windows 7 on a partition of my hard drive. To be honest, I am an Apple fan, and therefore rarely use Windows 7 as I prefer Snow Leopard.

When using Snow Leopard, it surprised me when I thought about my most used applications. I use Google Chrome as my primary web browser, and personally believe it blows Firefox, Opera and Safari out of the water; don't get me started on my views on Internet Explorer! For media playback I use VLC Media Player, due to its extensive file format support, and therefore rarely use Apple's QuickTime Player or iTunes. For FTP management I use FileZilla, an extremely powerful client that puts many paid alternatives to shame. For document work I use NeoOffice, a great Mac alternative to both Microsoft Office and Apple's iWork suite. It does the job fine, and doesn't cost the astronomical amount that Microsoft charges. For my archiving tasks I use The Unarchiver, powerful extraction software that is better than Apple's bundled efforts.

"I'm a huge supporter of the free software movement."
Gregory Papadopoulos (Sun Microsystems)

So, to be honest, I use open source software for the majority of my computing tasks every day. The big corporations' own efforts just aren't as good, in my opinion. Therefore, my original view that open source software should come pre-installed out of the box remains the same, and is stronger now than ever!

Posted on 4/14/2010 by JUDICIOUS JOE

WHAT IS AN ELECTRONIC GOVERNMENT?

Becoming an electronic government (shortened to eGovernment) means embracing the digital environment to enable comfortable, transparent and cheap interaction between a government and its citizens and businesses. The eGovernment process can be broken down into four categories:
  • Government-to-Consumer (G2C)
  • Government-to-Business (G2B)
  • Government-to-Government (G2G)
  • Government-to-Employees (G2E)
These four separate domains all play varying roles in terms of government activity. Aspects such as public holidays, public hearing schedules, online polling, campaigning, lodging tax returns and applying for services and grants are just some of the many examples.

An eGovernment doesn't refer just to the internet; other technologies such as SMS text messaging, MMS, CCTV, biometric identification and identity cards are some of the other methods commonly used in various countries. In the case of the UK Government, these methods are already being implemented.

"The public services portal Directgov is the major single access point for eGovernment services to citizens. Beyond the actual services offered, the portal also contains comprehensive information on a broad spectrum of fields making thus navigation within further websites unnecessary."
Digital Britain Final Report (Published by the UK Government)

In my opinion, an eGovernment is something the general public must accept. Websites such as Directgov are useful for a wealth of information. However, continual web surveillance and CCTV coverage are becoming so excessive and vast that general privacy is starting to become impossible.

Posted on 4/13/2010 by JUDICIOUS JOE

WHAT IS THE DIGITAL DIVIDE?

The digital divide is a term describing the gap between people with and without access to digital information technology; it often falls along the lines of rich and poor, young and old.

People in some countries simply don't have access to digital information technology, and therefore become automatically alienated in many ways. Some people believe that basic telecommunication services such as the internet are a fundamental right, and nobody should be deprived of it. I agree, but the internet is not important in the same ways as drinking water, food and shelter.

"People lack many things: jobs, shelter, food, health care and drinkable water. Today, being cut off from basic telecommunications services is a hardship almost as acute as these other deprivations, and may indeed reduce the chances of finding remedies to them."
Kofi Annan (UN Secretary-General)

Gordon Brown recently announced that the digital divide can be tackled by the installation of superfast broadband. He claimed that high speed internet will save the government billions and revolutionise how people access public services. I think they need to develop easy-to-use public websites, rather than increase data speeds, to make online procedures less tedious.

I, personally, don't agree with the UK government trying to digitise every daily task. It's the elderly who suffer the most, with computers often baffling them, and quite frankly it's not fair. Yes, the government should digitise the majority of essential services, to make computer-literate users' lives easier. However, it should not phase out traditional methods, such as paper forms. The digital divide will disappear once the generation that has grown up with computers since childhood enters its retirement years. This is my personal view, and it probably applies to the majority of governments around the world.

The digital divide will not be eliminated for at least 40 years, in the UK at least. With the continual decline in computer hardware costs, poorer countries may soon be able to develop wired towns and cities, but this is at least a few decades away yet.

Posted on 3/23/2010 by JUDICIOUS JOE

WHAT IS THE LONG TAIL?

Why are online retailers one step ahead of traditional retail stores? Simple: they have virtually unlimited shelf space. Other advantages include the extensive lists of similar items and recommendations that a normal shop worker would find impossible to remember. One company that puts these aspects into practice extremely well is Amazon.

The Long Tail concept was popularised by Wired magazine editor Chris Anderson, who later released a book titled 'The Long Tail: Why the Future of Business Is Selling Less of More.' The principle contrasts a few hit products selling in large quantities with many niche products each selling in smaller quantities. Chris Anderson is therefore suggesting that the future of business is to stock as many products as humanly possible in large warehouses.

Chris Anderson suggests that companies such as Google and Apple can mine both the head and the tail. They have the assets and resources to operate monopoly businesses. Take Apple's iTunes Store for instance, with a catalogue of more than 11 million songs. To put that into perspective, a regular HMV shop would have to be monumentally huge to store that number of records. The cost of storing and distributing digital files, however, is minimal, and the chance to profit from back catalogues becomes a distinct possibility.

"We sold more books today that didn't sell at all yesterday than we sold today of all the books that did sell yesterday"
Amazon Employee (Describing the Long Tail)

It seems to me that The Long Tail applies more to the digital age of the internet than to the traditional high street. If The Long Tail approach is widely accepted by the majority of businesses in the future, could high street stores start collapsing? After all, wasn't Woolworths killed off by the internet and supermarkets? It leaves me wondering whether the high street will soon become a row of warehouses. With the recent rumours of Amazon planning to open high street collection stores, it could soon become a reality.

Posted on 3/19/2010 by JUDICIOUS JOE

WHAT IS NET NEUTRALITY?

The primary objective of the Net Neutrality movement is to keep the internet open and accessible to all users, application providers and network carriers. Basically, Internet Service Providers (ISPs) should not interfere with the internet by favouring specific websites or applications over others.

Internet Service Providers can easily detect what customers use their connections for. They can discover if you regularly participate in online gaming, or use peer-to-peer programs such as BitTorrent. They can then shape your traffic, allocating different speeds or amounts of bandwidth to different applications. For instance, online gaming might be permitted at full speed, while peer-to-peer connections such as BitTorrent may be deliberately slowed down.

The whole point of shaping traffic is to stop users clogging up the internet, thereby preventing other users' connections from becoming slow. Some people believe ISPs limit certain applications that threaten their own businesses. Take Skype as a prime example, which allows free VoIP (Voice over Internet Protocol) phone calls. Companies such as Sky and TalkTalk are threatened by these sorts of applications, and may therefore wish to restrict their traffic allowance. This type of activity is also known as throttling, where the advertised network speed is not met. If new Net Neutrality laws were implemented, this activity would be deemed illegal.
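To illustrate the mechanics (not any real ISP's system - the class, rates and policy below are all invented for the example), here is a minimal sketch in Python of the token-bucket idea that underlies most traffic shaping:

    import time

    class TokenBucket:
        """Toy token-bucket shaper: packets pass only while tokens remain."""

        def __init__(self, rate_bytes_per_sec, burst_bytes):
            self.rate = rate_bytes_per_sec   # refill speed (the shaped bandwidth)
            self.capacity = burst_bytes      # maximum burst allowance
            self.tokens = burst_bytes
            self.last_refill = time.monotonic()

        def allow(self, packet_bytes):
            now = time.monotonic()
            # Refill tokens in proportion to elapsed time, up to capacity.
            self.tokens = min(self.capacity,
                              self.tokens + (now - self.last_refill) * self.rate)
            self.last_refill = now
            if self.tokens >= packet_bytes:
                self.tokens -= packet_bytes
                return True      # forwarded at full speed
            return False         # delayed - this is what throttling feels like

    # Hypothetical policy: gaming gets ten times the bandwidth of peer-to-peer.
    shapers = {"gaming": TokenBucket(1_000_000, 64_000),
               "p2p": TokenBucket(100_000, 16_000)}

Classifying a packet and calling allow() on the matching bucket is all a shaper needs to favour one kind of traffic over another.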

"To seize this moment we have to ensure free and full exchange of information, and that starts with an open internet. I will take a back seat to no one in my commitment to network neutrality, because once providers start to privilege some applications or websites over others then the smaller voices get squeezed out and we all lose. The internet is perhaps the most open network in history and we have to keep it that way."
Barack Obama (November 14, 2007 - Mountain View, California)

Net neutrality is essential in my opinion, and benefits the internet in the same way open source benefits software. With Google and Barack Obama on board, it can't be long before new laws are implemented. The only aspect consumers will be concerned about is whether the cost of Net Neutrality will be passed on to them.

Posted on 3/09/2010 by JUDICIOUS JOE

IS A SEMANTIC WEB A GOOD IDEA?

Humans often use the internet to carry out everyday activities, such as purchasing items or researching various topics. However, a computer is not able to complete the same tasks, because the language of web pages can only be understood by people, not computers. The whole principle of a Semantic Web is to make computers understand web pages in the same way humans do. This would allow boring and tedious tasks to be completed by the computer rather than the human, obviously leaving the user more time to be productive with applications they enjoy. Sir Tim Berners-Lee defines a Semantic Web as a Web 3.0 aspect that will make the World Wide Web better.

Now, let's put all of this information into a practical example. Say you've got your bank statements online, along with some photographs and appointments in a calendar. Wouldn't it be great to have your photos embedded into the calendar so you could see when they were captured, along with important bank statements attached too? Well, at the moment, that isn't possible. It's simply because each element is stored within its own standalone application or server, where it's kept to itself. This collaboration of data would be made possible by the web of data a Semantic Web provides.
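As a rough illustration of that 'web of data' idea (a minimal sketch with invented data - real Semantic Web systems use RDF, but the subject-predicate-object triple below is the same core concept):

    # Each fact is a (subject, predicate, object) triple - the core idea behind RDF.
    triples = [
        ("photo42.jpg", "takenOn", "2010-03-14"),
        ("statement-march.pdf", "issuedOn", "2010-03-14"),
        ("dentist-appointment", "scheduledOn", "2010-03-14"),
    ]

    def things_on(date):
        """Gather every resource linked to one date, whichever application owns it."""
        return [subject for (subject, predicate, obj) in triples if obj == date]

    print(things_on("2010-03-14"))
    # ['photo42.jpg', 'statement-march.pdf', 'dentist-appointment']

Because every application publishes its facts in the same machine-readable shape, one simple query can pull the photo, the statement and the appointment together, which is exactly what the standalone applications above cannot do today.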

"I have a dream for the Web [in which computers] become capable of analyzing all the data on the Web – the content, links, and transactions between people and computers. A ‘Semantic Web’, which should make this possible, has yet to emerge, but when it does, the day-to-day mechanisms of trade, bureaucracy and our daily lives will be handled by machines talking to machines. The ‘intelligent agents’ people have touted for ages will finally materialize." Sir Tim Burners-Lee (1999)

Metadata is an important element that makes web pages more relevant, and therefore helps benefit the proposals of a Semantic Web. Accurate metadata is all well and good, and beneficial to internet users, but Metacrap is a problematic issue.

Metacrap is what happens when metadata is not used with care; the term comes from Cory Doctorow, who in 2001 defined seven insurmountable obstacles to reliable metadata:
      • People lie (obstacle #1)
      • People are lazy (obstacle #2)
      • People are stupid (obstacle #3)
      • Mission Impossible: know thyself (obstacle #4)
      • Schemas aren't neutral (obstacle #5)
      • Metrics influence results (obstacle #6)
      • There's more than one way to describe something (obstacle #7)
      Therefore, a semantic web requires a massive collaboration between a vast array of internet users, organisations and companies. Web 2.0 was a drastic step up from Web 1.0, and Web 3.0 looks set to be another massive leap forward.



Posted on 3/02/2010 by JUDICIOUS JOE

OPEN SOURCE SOFTWARE TO BE PRE-INSTALLED?

      Most operating systems come with pre-installed software, sometimes officially made by the vendors. Take for instance Windows 7, with Microsoft including programs such as Windows Media Player and Internet Explorer. Open source programs such as VLC Media Player and Mozilla Firefox surpass Microsoft's efforts by miles. The same could be said about Apple's Snow Leopard, where QuickTime is easily surpassed by VLC Media Player's extensive file format support.

This leads on to my question: should open source software come pre-installed? Mozilla's Firefox has hugely dented Internet Explorer's market share. Why is this? It is undoubtedly due to many aspects: it has a cleaner graphical user interface, is faster, has useful plug-ins, is highly customisable, is secure, and most importantly complies with web standards. Mozilla's Firefox scores 96/100 in the Acid3 test; Microsoft's forthcoming Internet Explorer 9 scores a miserable 32/100. A prime example of how open source can be better than large corporations' efforts.

Microsoft came under pressure last year not to include Internet Explorer within Windows 7. The European Commission deemed the bundling of Internet Explorer anti-competitive. It wanted users to have the ability to choose their own preferred browser, rather than have Microsoft's efforts forced upon them. This was later ruled out, however, and Windows 7 shipped in Europe with Internet Explorer 8 pre-installed as originally planned.

There are hundreds of open source programs that are better than the huge corporations' efforts. As mentioned previously, Mozilla Firefox is my internet browser of choice, Mozilla Thunderbird for email and VLC Media Player for entertainment (I am yet to come across a file it won't play). There are numerous others, but it would take a significant amount of time to list them all. The quote below summarises open source software perfectly.

      "The availability of the source code and the right to modify it is very important. It enables the unlimited tuning and improvement of a software product. It also makes it possible to port the code to new hardware, to adapt it to changing conditions and to reach a detailed understanding of how the system works. This is why many experts are reaching the conclusion that to really extend the lifetime of an application, it must be available in source form. In fact, no binary-only application more than 10 years old now survives in unmodified form, while several open source software systems from the 1980's are still in widespread use (although in many cases conveniently adapted to new environments). Source code availability also makes it much easier to isolate bugs, and (for a programmer) to fix them."
      Jesus M. Gonzalez-Barahona (Advantages of Open Source Software - Published 2000)

I don't see Microsoft or Apple bundling open source software that rivals their own creations anytime soon. For the moment it remains up to the user to download all their preferred programs. However, could manufacturers of PCs soon place open source software on their machines, much like the free anti-virus trials many offer at the moment? It seems more likely to happen with Windows PCs than with Apple computers, due to the fact that Apple builds both the hardware and software of every Mac!

Posted on 2/24/2010 by JUDICIOUS JOE

      WILL THE GOVERNMENT REPLICATE BIG BROTHER?

      An Internet Service Provider (ISP) is what every household must use to get connected to the internet. The majority of ISP companies offer various packages and speeds.

Since the implementation of broadband, data speeds have gradually become faster. The days of dial-up modems are quickly becoming a thing of the past. We are now in an era of Wi-Fi internet, where 8 Mbps connections are deemed a satisfactory speed.

Every website you visit is logged by your computer, unless you enable the private browsing modes available in internet browsers such as Mozilla Firefox and Google Chrome. That's fine: your computer enables you to build up an archive so that you never forget the good websites you've visited.

However, the UK Government is implementing controversial new legislation that will affect every computer user in Britain. All telecommunication companies and Internet Service Providers (ISPs) will be required by law to keep a record of every customer's personal communications. This includes all of the websites you visit and who you have contacted, with specific details of where and when, for a period of twelve months.

This new law increases the amount of personal data that can be obtained via the Regulation of Investigatory Powers Act (RIPA), which was only supposed to be used for terrorism purposes.

      This new legislation is known as the Intercept Modernisation Programme. It forces every company to effectively monitor and archive every online mouse click, in a Big Brother fashion.

Companies such as BT, Orange and Vodafone will be involved in storing the data, at a monumental cost of £10 billion over a ten year period; paid for by the taxpayer, of course.

      "Whilst this is no doubt necessary in persuing terrorist suspects, the proposals are so intrusive that they should be subject to legal approval, and should not be available except in pursuit of the most serious crimes."
      David Davis (former Shadow Home Secretary)

      Ministers originally wanted to store information on a single government run database, but decided not to due to privacy concerns. However, they are pressing on ahead with the privately held databases, and a Big Brother society is becoming ever closer.

These new laws will have a profound impact on Internet Service Providers (ISPs). For example, if around 10 billion emails are sent via the internet each day, and each email is only 2 kilobytes in size (mainly text), and ISPs are forced to store all messages for twelve months, then approximately 7,000 terabytes of storage is needed worldwide. This is based on research from IDC in 2001. The majority of emails today are rarely just 2 kilobytes in size; in fact many are now measured in megabytes. Storage costs money, and the consumer will have to foot the increased bill.
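A quick back-of-the-envelope check of that figure, assuming roughly 10 billion two-kilobyte emails per day as above (these are the estimates quoted, not measured data):

    emails_per_day = 10_000_000_000    # ~10 billion emails per day (2001-era estimate)
    bytes_per_email = 2_000            # 2 kilobytes, mostly text
    days_retained = 365                # the twelve-month retention period

    total_bytes = emails_per_day * bytes_per_email * days_retained
    print(total_bytes / 1e12, "terabytes")  # 7300.0 - roughly the 7,000 TB quoted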

You could argue that the government is implementing a perfectly legitimate law; after all, if you've got nothing to hide, what's the problem? However, how would you feel if the authorities knew extensive amounts of information about your life? Who's to say the data will only be used for criminal investigations?

Posted on 2/16/2010 by JUDICIOUS JOE

      IS INTELLECTUAL PROPERTY RESPECTED?

      These days, new and updated electronics are appearing every week. The majority of products are patented upon release, or have patents pending.

Companies that implement innovative ideas deserve the right to forbid other manufacturers from copying their intellectual property. That's a brief explanation of what a patent is, and any company that infringes one risks being sued.

The Apple iPhone was revolutionary when it was released back in June 2007. It changed the industry from tactile buttons to touch screens. The graphical user interface is simply beautiful, adopting its looks from its big brother, Mac OS X. The iPhone made every other manufacturer rush to compete, with some aspects difficult to replicate. For instance, Apple's multi-touch capacitive screen technology still remains the best on any phone, with unprecedented accuracy, and was subsequently patented in January 2009.

However, Apple isn't as squeaky clean as some people may believe. Nokia recently announced they wish to sue Apple for infringing patents on mobile phone technology. They accused Apple of "trying to get a free ride on the back of Nokia's innovation." Allegedly, there are ten patent infringements, covering wireless data, speech coding, security and encryption features of the iPhone. Nokia has agreements with approximately 40 manufacturers permitting them to use its technology, with Apple having signed no such agreement. A cheeky move by Apple, seeing as Nokia has invested £36.2 billion in research and development over the last two decades. No doubt both companies will have elite lawyers working for them.

      "The basic principle in the mobile industry is that those companies who contribute in technology development to establish standards create intellectual property, which others then need to compensate for. Apple is also expected to follow this principle."
      Ilkka Rahnasto (Vice President of Legal and Intellectual Property - Nokia)

It seems as though elite companies would rather take the risk of infringing patents than get clearance prior to launching their products. Apple is a huge computer company, dominant in the portable market since the release of the iPod, iPhone and MacBook products.

Posted on 2/09/2010 by JUDICIOUS JOE

      IS PERSONAL PRIVACY RAPIDLY DECLINING?

For centuries, people have been obsessed with their privacy. It's human nature not to want everything made public. Surveillance these days is everywhere. From CCTV cameras, to mobile phones (GPS tracking), to bank cards; the authorities have a good idea of your approximate location. In fact, it's mainly recent advancements in technology that have made the 'Big Brother' attitude possible.

Since the dawn of the World Wide Web, privacy has rapidly been becoming impossible. The majority of content on the internet is permanent rather than temporary, and digital content can spread rapidly within short periods of time. For instance, if you upload a home video to YouTube, you must be prepared for it to spread all over the internet; take Star Wars Kid as a prime example.

Social networking websites such as Facebook also have issues with privacy. Some status updates are very personal, previously being something only your diary would know, but now digitised for all your friends to read. Ever been tagged in a photo you don't like? Most people have, so you just remove your tag, don't you? The problem is the photo is only removed from your profile, not your friends', and it therefore remains on the internet for people to happily view. Issues that would never have existed a decade ago are now an everyday occurrence. Some people are too personal with their status updates, consequently losing their jobs or risking their safety. There is even said to be a master password for Facebook accounts, for the authorities to use if you are under suspicion.

Every Google search you make is recorded for two years. What you search for is often personal, and the authorities can easily establish what kind of person you are. Google is regarded as the king of the internet, with most people using it to find and browse every topic and website in the world.

Email can also easily be read by the authorities. Some messages are very private and confidential, meant for you and your intended recipient alone. However, you simply have no choice, with messages read in the interest of public safety.

      "These days, if you feel like somebody's watching you, you might be right."
      Mann, Nolan & Wellman (Sousveillance - Published 2002)

      In the new digital age, you have to accept that authorities know scary amounts of information about you. The only invention that retains some of your privacy is the door, invented during Egyptian times. Could we soon be part of a super database, or are we already on one?

Posted on 2/04/2010 by JUDICIOUS JOE

      ARE ROCKSTAR THE PIONEERS OF VIRTUAL WORLDS?

      Rockstar Games are a hugely successful company, with the Grand Theft Auto series selling a total of 70 million copies worldwide, and ranked third in the Guinness top fifty games of all time list.

The franchise was originally created by David Jones, a Scottish games programmer, with help from English brothers Dan and Sam Houser and game designer Zachary Clarke. The gameplay contains a mixture of elements including action, adventure, driving and stealth. The Grand Theft Auto series began in 1997 and runs to the present day, covering four console generations.

      The game places you in an open environment, where you must complete tasks and objectives in order to climb the ranks of the criminal world. Subsequently, you can complete tasks in any order you wish, playing casually or more seriously. It's these elements that give the series so much replay value, whilst making it incredibly addictive.

The major breakthrough title was undoubtedly Grand Theft Auto III, a revolutionary game when it was released in October 2001. It was the series' first 3D open world title, as opposed to the bird's-eye camera view of previous games. The most notable later releases included the following:
      • Grand Theft Auto: Vice City (released October 27, 2002)
      • Grand Theft Auto: San Andreas (released October 26, 2004)
      • Grand Theft Auto IV (released April 29, 2008)
      The plots of the GTA series are also spectacular, with famous actors voicing characters in the games. Tommy Vercetti from Vice City is voiced by Ray Liotta, whilst Frank Tenpenny from San Andreas is voiced by Samuel L Jackson.

However, Steven Poole's thoughts on video game incoherence are definitely present in all of Rockstar's Grand Theft Auto titles. Take for instance incoherence of causality, where aspects do not behave as they do in real life; present in all GTA titles due to the ability to carry numerous heavy weapons at once. Incoherence of space is also a problem, where expectations of actions are not met; present in Vice City, where Tommy Vercetti is unable to swim (even through shallow water).

      "I could go on and on about why Grand Theft Auto IV is one of the best games we've ever seen and why even folks who are easily offended should play it, but that would be pointless. The only thing you need to know is that you have to play this game. Period."
      Hilary Goldstein (IGN Review)

It's not opinion, but rather fact, that GTA IV is the best game of the seventh generation console era. Metacritic rates both the PlayStation 3 and Xbox 360 versions of the game 98/100, the number one game on both consoles. No other games developer comes close to Rockstar's open world creations; and with expectations so high for GTA V, Rockstar's next instalment in the series needs to be a masterpiece.

Posted on 1/27/2010 by JUDICIOUS JOE

      IS MICROSOFT'S INTERNET EXPLORER DOOMED?

Microsoft's internet browser, Internet Explorer (often abbreviated to 'IE'), has come under criticism recently for having serious security vulnerabilities. This is not good news for Microsoft, as other browsers on the market are capable alternatives.

      It all started when Google email accounts were hacked in China via an exploited security loophole within Internet Explorer 6. Human rights activists were targeted, with Google later threatening to pull out of China completely.

      Google is a huge organisation, and definitely has the ability to dent Microsoft's reputation. However, matters soon became worse, with both the German and French governments warning citizens against using IE and to find an alternative browser. The UK Government is supporting Microsoft however, not issuing any warnings, and claiming there is minimal risk.

The two most viable alternatives are Mozilla's Firefox and Google's Chrome. Both are fantastic web browsers, and completely wipe the floor with IE. They are both faster, more reliable and comply with relevant web standards (as measured by tests such as Acid3). Other rival software includes Apple's Safari and Opera. Obviously the security vulnerability affects PC users more, as Internet Explorer for Mac was discontinued in 2003.

      Now, you'd think the security issues would only affect version 6 of IE, but no; researchers have developed code that can exploit the same vulnerability within Internet Explorer 7. Microsoft say they are working hard to release a patch to solve the issues, but have no specific release date. 

      "As a web developer I can honestly say that banning IE altogether would be the best thing ever to happen in my career. People who use it deserve to be hacked as far as I'm concerned."
      Matt (The Telegraph - Comments Section)

      Even though Google have been hacked, they must be sitting with a smug grin on their face, as thousands of users now flock to alternative browsers.

Personally, I now use Google's Chrome all the time, on both PC and Mac. It rarely crashes, has a simple but innovative GUI, and is extremely fast. Firefox is equally good, and surpasses Internet Explorer in almost every aspect. Software updates for both browsers are also very frequent, so they are improving all the time.

      Internet Explorer is the most popular browser in the world; peaking at 95% usage share in 2004, and since declining to 62%. It could soon see its user base drastically fall, with Firefox currently occupying 24%, Chrome 5%, Safari 4% and Opera 1%.

Posted on 1/19/2010 by JUDICIOUS JOE

      WILL THE SONY READER KILL TRADITIONAL BOOKS?

The traditional book is a fantastic technology; durable, bound and fixed. The ancient Romans are credited with inventing the book, placing sheets of writing material between wooden covers. Despite all the recent advancements in technology, the book is still regarded as an excellent resource.

However, the Amazon Kindle and Sony Reader have recently been released, aiming to revolutionise the book industry. The Sony Reader features an electronic paper display with 5", 6" or 7" screen sizes. You can purchase books (known as eBooks) from various online stores, as well as read personal documents, PDF files, blogs and RSS news feeds. In essence, it offers a multimedia experience whilst retaining the natural display of a book by appearing as ordinary ink on paper.

      "You will often hear it said that the print medium is a doomed and outdated technology, a mere curiosity of bygone days destined soon to be consigned forever to those dusty unattended museums we now call libraries."
      Robert Coover (New York Times - Published 1992)

Personally, I don't believe these devices pose any real threat to traditional books. One obvious deterrent is cost, with the cheapest Sony Reader model costing £150. The main advantage is undoubtedly the ability to store approximately 350 books on one device, making it a great space saver and convenient whilst commuting. To be honest, I would feel less nervous throwing a traditional book into my bag than I would a Sony Reader!

However, this year is when everything looks set to change, with Apple due to release its iPad. The iPod Touch has been a huge success, with a touch screen no other manufacturer can match. Essentially, that is what the iPad is: a larger, more powerful iPod Touch, with a 9.7" screen. The iPad will pose a massive threat to the traditional book, due to its innovative multimedia capabilities and powerful hardware. Numerous newspaper companies are already working to digitise their publications specifically for the iPad.

      To conclude, the traditional book is under no great threat at the moment, with the Sony Reader too new to show any major market share. However, Apple's iPad is what could save the declining magazine and newspaper industries, turning the market around and starting another digital revolution.

Posted on 1/13/2010 by JUDICIOUS JOE

      IS WEB 2.0 ADDICTIVE?

Since the term Web 2.0 was popularised at the annual Web 2.0 Conference in San Francisco, websites have adopted numerous features to survive the rigorous dotcom market.

Companies such as eBay (founded in 1995) have incorporated new web features to remain the market leader in online auctions. Simplicity is what the user wants, and the easier a website is to use, the more users will come back time and time again. The introduction of eBay user feedback is a perfect example of Web 2.0, allowing you to see which sellers are trusted and who to avoid doing business with.

      "Invariably, Web 2.0 is a term you love to hate or hate to love but either way, you'll know you'll get folks attention by saying it."
      Dion Hinchcliffe

The growth of Facebook has shown how Web 2.0 can become a major social success. Set up in 2004, the website has rapidly grown into a community of 350 million users. In essence, it's a sophisticated electronic diary, allowing you to share content with friends. However, Facebook is criticised for its Privacy Policy, with some believing it's used for surveillance and data mining.

Some companies, such as Amazon, log what products you browse and purchase. This allows them to show recommendations based on their database; a very clever business model. Furthermore, Amazon is such a joy to browse and purchase from that it has placed many high street book shops into crisis.

So, is Web 2.0 addictive? In cases such as Facebook the answer is a resounding yes. The younger generation spends extremely long hours online, and Facebook has now become both an addiction and a concern for some. However good technology becomes in the future, with Web 3.0 and beyond, you will never beat physical and verbal interaction in real life.

Posted on 12/15/2009 by JUDICIOUS JOE

      ARE VIRTUAL WORLDS IMMERSIVE?

      Computer games are now a normal aspect of everyday life; but are virtual worlds really immersive, and not just entertaining?

The first major computer game that demonstrated how open worlds could work was Grand Theft Auto III. Rockstar, the developers, moved dramatically away from previous 2D incarnations into a large scale 3D world. The player could roam the fictional map of Liberty City much as in real life, choosing how to complete goals and objectives. As years passed, no other game could match the overall GTA experience. Rockstar later released Vice City, San Andreas and GTA IV, the latter regarded by many as one of the best games of all time.

However, computer games such as GTA have come under criticism recently for encouraging violence. One teenage boy shot two police officers and a dispatcher to death in 2003, apparently re-enacting acts from the game.

      Not all virtual worlds are violent, however, with some perfectly suitable for young audiences. Need for Speed Most Wanted is a good example, allowing you to freely roam a virtual city with high performance cars, whilst trying to become the most notorious street racer. The cars were even laser scanned to enable maximum realism.

Another interesting virtual world is Second Life, a way of interacting with other people over the internet through avatars (virtual characters). The possibilities are profound; you can build realistic structures and replicas, and engage in normal real life activities. Similarly to GTA, Second Life has also come under criticism, with some strange occurrences.

      "It may have started online but it existed entirely in the real world and it hurts just as much."
      Amy Tayler

Overall, you could suggest virtual worlds are immersive environments. However, I believe it's down to your personality and interests, and your state of well-being. If you're depressed in real life but happy in Second Life, the virtual world may cheer you up during your short period of immersion. Some people take Second Life far too seriously; should virtual activities really affect real life relationships?

Posted on 12/08/2009 by JUDICIOUS JOE

      WILL COMPUTERS SURPASS THE HUMAN MIND?

Computers have come a long way in such a short space of time. Random Access Memory (RAM) used to be measured in kilobytes (KB), but is now measured in gigabytes (GB). Hard Disk Drives (HDDs) are much the same story, moving from megabytes (MB) to gigabytes (GB), or even terabytes (TB). To get your head around the byte terminology: 1,024 bytes make a kilobyte, 1,024 kilobytes a megabyte, and so on up through gigabytes and terabytes.

It's the computer's main brain (the processor), however, that has become dramatically more complex and sophisticated. Take the first ever Apple Macintosh as an example, which had an 8 MHz Motorola 68000 microprocessor with 128 KB of RAM. By today's standards that is extremely slow; to prove the point, let's compare it with the latest Apple iMac. With a 2.8 GHz quad core processor running on 16 GB of RAM, the new iMac provides extreme computing power.
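To quantify that leap, a quick calculation using the figures above (a raw clock-for-clock comparison only; it ignores the many other architectural improvements since 1984):

    # Original Macintosh (1984) versus a recent iMac, using the figures above.
    old_clock_hz = 8e6             # 8 MHz Motorola 68000
    new_clock_hz = 2.8e9           # 2.8 GHz (and that is just one of four cores)
    old_ram_bytes = 128 * 1024     # 128 KB
    new_ram_bytes = 16 * 1024**3   # 16 GB

    print("Clock speed:", new_clock_hz / old_clock_hz, "times faster")  # 350x
    print("Memory:", new_ram_bytes // old_ram_bytes, "times larger")    # 131072x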

Now, do all these developments mean computers are gradually getting closer to surpassing the human mind? In 1997, Garry Kasparov took part in an experiment by IBM to beat its chess-playing computer, named 'Deep Blue'. The machine won two games, lost one and drew three, thereby beating the human counterpart (the reigning world champion) overall. This was a glimpse of how machines could one day become superior to the human race.

      "A machine that can think remains the dream; and it's still many years and quite a few startling breakthroughs away."
      Computer History Museum

      Numerous films have also showcased how artificial intelligence and robotics could go wrong. A fantastic example is I, Robot. This demonstrates how robots could have emotions and turn against humanity even when programmed with the 'Three Laws of Robotics' written by Isaac Asimov.

However, modern day computers are useless without some form of operating system. Turn on any computer with a blank hard drive and nothing will happen. In essence, computers can only become more intelligent through programming and development by human beings. Therefore, if we act carelessly, we could cause one of mankind's creations to turn against us, possibly with huge consequences.

Posted on 12/01/2009 by JUDICIOUS JOE

      WHAT ARE THE WORST & BEST USER INTERFACES?

A user interface is a method of interacting with a particular machine, device or computer program; in other words, human-computer interaction.

The Xerox Alto was the first computer to use a graphical user interface (GUI), in 1973. It was revolutionary and inspired Steve Jobs in the creation of the Macintosh in 1984. Mac OS version 1.0 was released on January 24, 1984.

      Meanwhile, Microsoft were also keen on the idea of a graphical user interface, wanting to move away from the confusion of the MS-DOS format. They launched Microsoft Windows version 1.0 on 20 November 1985. Since then, both Apple and Microsoft have strived to make their operating systems more intuitive and reliable.

Unlike Microsoft, Apple labels its operating systems with decimalised version numbers. Mac OS has gone through ten incarnations: 1, 2, 3, 4, 5, 6, 7, 8, 9 and the current 10 series (currently on version 10.6.2). Interestingly, Apple names its major releases in the current series after big cats, and refers to the series as Mac OS X (Roman numerals):
      • 10.0 Cheetah (released March 24, 2001)
      • 10.1 Puma (released September 25, 2001)
      • 10.2 Jaguar (released August 24, 2002)
      • 10.3 Panther (released October 24, 2003)
      • 10.4 Tiger (released April 29, 2005)
      • 10.5 Leopard (released October 26, 2007)
      • 10.6 Snow Leopard (released August 28, 2009)
All of Apple's releases have featured notable changes and improvements, except Snow Leopard, which was more a refined Leopard than the kind of radical transition that occurred from Tiger to Leopard. Microsoft, on the other hand, is inconsistent with its operating system names. The most notable releases include the following:
      • Windows 95 (released August 24, 1995)
      • Windows 98 (released June 25, 1998)
      • Windows XP (released October 25, 2001)
      • Windows Vista (released January 30, 2007)
      • Windows 7 (released October 22, 2009)
Windows 95 was a revolutionary step up from Windows 3.1, targeting the mass consumer market; Windows 98 was more about refinements and tweaks; Windows XP featured a massive redesign and the adoption of digital media; Windows Vista was a major visual change to the GUI, and a flawed disaster; and Windows 7 is a fine-tuned version of Windows Vista, regarded as the best operating system Microsoft has ever produced.

So what is the worst user interface? An obvious choice: Windows Vista. It was incredibly bloated, slow, full of bugs, and riddled with incompatibilities. It has improved after two service packs, but it was rushed and should never have been released as early as it was. Actually, I can't name one Microsoft operating system that hasn't crashed on me numerous times!

      "Any operating system that provokes a campaign for it's predecessor's reintroduction deserves to be classed as terrible technology."
      Nate Lanxson (Windows Vista)

So, what is the best user interface? My personal favourite is Apple's Snow Leopard. It is beautiful to look at, fast, reliable, and incredibly intuitive. In fact Snow Leopard is so good it makes you look at Windows in a different light, picking out every possible fault you can.

Posted on 11/24/2009 by JUDICIOUS JOE

      IS GOOGLE A MONOPOLY?

      A monopoly is acknowledged by the UK government when a company reaches 25% market share. Google completely obliterates this figure, accounting for 65.1% of all internet searches in the USA. This is more than triple the share of rival Yahoo, and more than nine times that of Microsoft.

Google was founded on September 4, 1998 by Larry Page and Sergey Brin whilst they were studying at Stanford University. The stated mission was "to organise the world's information and make it universally accessible and useful."

Since then Google has rapidly grown in size and popularity. The company now has 19,786 employees, processes over 1 petabyte of data on its servers every hour, and runs the most visited website in the world. In fact Google has had such a profound impact on society that it has become a verb.

However, Google has come under fire in recent years. In 2007 it was criticised for placing a cookie on users' computers that tracked search history and didn't expire until 2038; this was later changed and the cookie now lasts for two years. Privacy International is concerned about how the data of millions of internet users could easily be handed to governments on request. This was demonstrated recently when academic researchers uncovered someone's identity purely from her internet searches.

      "Cyberspace can be seen as the new bomb, a pacific blaze that will project the imprint of our disembodied selves on the walls of eternity."
      Nicole Stenger

      Essentially the more you use Google, the more accurate its databases become. You are feeding the monster information, often in an extensive and regular manner. Other search engines fight to better Google's ideas, with most either trailing or failing. Google looks certain to continue growing for many years to come, and is undoubtedly the search engine monopoly.

Posted on 11/17/2009 by JUDICIOUS JOE

      DOES CULTURAL IMAGINATION FORM THE FUTURE?

A great deal of the technology we use today was only dreamed about decades ago. Take for instance games consoles, which started out with basic 2D graphics, such as the Magnavox Odyssey from 1972 (first generation). Back then, no one could have imagined ultra-realistic 3D environments adapting and changing in real time, now possible on consoles such as Sony's PlayStation 3 (seventh generation).

This relates to the question, as games designers need good imaginations to draw up and create their ideas, along with implementing new features made possible by more powerful hardware. Take FIFA Football for example, which Electronic Arts have developed since 1993, and which is now regarded by critics as the best virtual football game.

      Culture is defined in the dictionary as: 'the sum total of ways of living built up by a group of human beings and transmitted from one generation to another.' This is a general definition, but it does apply to technological aspects of life.

The cultural imagination surrounding mobile phones has also started a revolution for people of all ages. They are now regarded as 'essential', with some people feeling incomplete without them.

      "To be happy in this world, first you need a cell [mobile] phone and then you need an airplane. Then you're truly wireless."
      Ted Turner

Therefore cultural imagination does form the future, and will continue to do so for centuries to come. A new idea will surpass the mobile phone craze in future years; it just seems impossible to imagine what, at this present moment in time.


Posted on 11/11/2009 by JUDICIOUS JOE

      HAS TECHNOLOGY CHANGED NEWS DELIVERY?

Many transmission technologies have appeared over the years, each proving effective in the broadcasting of news.

      Take for instance radio and television, both everyday methods the media use to broadcast with. The first ‘true’ radio broadcast was made in 1915, transmitting from New York City to San Francisco, and Virginia to Paris. However, it was in 1933 that radio began to take off, with the introduction of Frequency Modulation (FM radio) by Edwin Howard Armstrong. It radically improved audio signals by controlling the noise static caused by the earth’s atmosphere and electrical equipment. Radio developed further during later years, with Bell Labs inventing the transistor in 1947 and Sony introducing the transistor radio in 1954. Now technologies such as DAB (Digital Audio Broadcasting) radio are used, but FM still proves a more effective and reliable platform, with not much audible difference.

      Television on the other hand, similarly to radio, was developed by numerous people to contribute to one idea: ‘an electron beam scanning a picture in horizontal lines.’ In 1939, commercial television was launched by David Sarnoff; the vice president of the Radio Corporation of America (RCA). Since then, television has undergone major changes; from black and white to colour, from analogue to digital, and from Cathode Ray Tube (CRT) to Liquid Crystal Display (LCD) and Organic Light Emitting Diode (OLED).

Radio is an effective platform for keeping people up to date with the news while they are on the move. Every production car now has a radio as standard; in fact it's a consumer essential. Commercial music stations usually broadcast news at the top of the hour, with some stations focussing on it 24 hours a day (LBC, for example). Television operates in a similar manner, with news usually broadcast at the top of the hour, and some stations providing dedicated 24 hour news coverage (such as BBC News and Sky News).

The internet has also proved to be a fantastic platform for news. Websites have become more content rich as new web standards have been implemented, and therefore combine different media more effectively; text, photographs and video, for instance. The internet also enables you to pick out the content you're interested in, which few other media let you do easily. The BBC News website is a fine example of how a news website should look, attracting millions of visitors a day.

Mobile phones are now an emerging platform too, with devices such as the iPhone leading the way. Smartphones can now load websites just as a traditional computer does, and with mobile applications such as Sky News and ITN, content can be delivered as stories are breaking. You can even watch BBC iPlayer on some devices, streaming BBC News broadcasts.

      "In the twenty years Sky News has been with us the world has spun a few times and things have changed apace, most noticeably the technology now used to bring us the news." 
      Sir Michael Parkinson (Sky News: 20 Years of Breaking News - Published 2009)

      Now, onto the question as to whether technology has changed news delivery. The answer is an obvious yes. Without technological developments 24 hour television news would never have launched, and services such as BBC News Online wouldn't have soared in popularity. We are living in a new digital age, where people want access to news 24 hours a day, wherever they are in the world.

Posted on 11/04/2009 by JUDICIOUS JOE

      IS TECHNOLOGICAL CONVERGENCE OCCURRING?

Decades ago you used to buy gadgets that each served a single purpose; one unit did one task well and little else.

Take the Sony Walkman for instance, which brought the revolution of having music in your pocket wherever you went. The idea came from Nobutoshi Kihara, who wanted the ability to listen to operas whilst flying on transpacific plane trips. First launched in cassette format in 1979, it proved a big hit amongst consumers. The Walkman became so popular it changed with the times, with the CD Walkman (also known as the Discman), MiniDisc Walkman and MP3 Walkman popping up over the years. The brand is still going strong to the present day, and is now also implemented as software in Sony Ericsson phones.

However, it is the iPod that has taken all the limelight recently, becoming one of the most famous inventions of the 21st century. The hard drive based device (more recently flash storage), capable of storing thousands of tracks, has become one of the most popular portable gadgets ever created.

Now, on to the question as to whether technological convergence is occurring. The answer is an obvious yes, with the best example being the Apple iPhone. Launched in 2007, it has completely changed the mobile phone industry. It features an awesome 3.5 inch touch screen that works beautifully in virtually every aspect. So, what features has it got packed into its shell? Well, firstly it's a mobile phone, an iPod, a camera, a video recorder, a GPS mapping device, an internet web browser and email client, a calculator, a voice recorder, and endless other features thanks to the App Store (featuring over 85,000 applications).

To put that into perspective, you now don't need to carry the following with you:
      • iPod (for both audio and video playback)
      • Camera (for digital photography)
      • Camcorder (for video recording)
      • Satellite Navigation Device (for long car journeys)
      • Laptop (for internet browsing and email)
      • Calculator (for complicated calculations)
      • Dictaphone (for voice recordings)
• Or even a builder's level (the iHandy Level app turns your iPhone into one)
      The iPhone has made so many gadgets redundant, and is one of the best all-in-one devices on the market due to its intuitive and innovative operating system.

      "The iPhone is the most sophisticated, outlook-challenging piece of electronics to come along in years. It does so many things so well, and so pleasurably, that you tend to forgive its foibles."
      David Pogue

An iPhone killer could well come along in future years, but for the moment it stands as one of the best devices of the 21st century, selling over 20 million units. The recent 8% decline in iPod sales is partly down to the iPhone's features, and the device has given nearly every other mobile phone manufacturer a headache!


Posted on 10/30/2009 by JUDICIOUS JOE

      WHO CREATED THE WORLD WIDE WEB?

      The World Wide Web is a huge virtual landscape with an endless amount of information and digital content. It’s now a normal aspect of everyday life, but how was it formed and who by?

The internet was originally developed by DARPA (Defense Advanced Research Projects Agency) from the mid-1960s, with the focus of sharing research between various universities and defence facilities. The first messages were transmitted in 1969.

The internet is simply a way of viewing files that have been placed onto a server; and because it rests on numerous protocols, no single person invented it. For instance, Leonard Kleinrock was the first to develop the idea of packet switching, a fundamental aspect of how the internet functions, while TCP/IP was developed by Vint Cerf and Robert Kahn in 1974.

      Perhaps the most significant development was by the British engineer Sir Tim Berners-Lee, who is regarded as the inventor of the World Wide Web and HTTP (Hyper Text Transfer Protocol); alongside developing HTML (Hyper Text Markup Language), and URL (Uniform Resource Locator). His first proposal was made in March 1989, with the first ever communication being made on 25 December 1990.

      “I just had to take the hypertext idea and connect it to the Transmission Control Protocol and domain name system ideas and - ta-da! - the World Wide Web.”
      Sir Tim Berners-Lee

The internet is now hugely popular in almost every aspect of life: for business, for education and for social use. Websites have become content rich since the implementation of Web 2.0, with streaming video now a reality thanks to increased internet speeds. It's also a 24 hour shopping centre, where you can compare prices to get the best deal. People now micro-blog about their lives via services such as Facebook and Twitter. And arguably the best website ever created is Google, the powerful and innovative search engine that helps you find what you want when you want it, and the most visited website in the world.

The statistics speak for themselves: in December 1995, 16 million people used the internet (just 0.4% of the world population). As of June 2009, there were 1,669 million users, 24.7% of the world population.

      The World Wide Web is regarded by many as the best invention of the 20th Century, with The Telegraph ranking Sir Tim Berners-Lee number one in their list of 100 greatest living geniuses (published in 2007). It will continue to grow for decades to come, always innovating and bringing new content to the masses, with the next major development being internet connected televisions in living rooms across the globe.


Posted on 10/29/2009 by JUDICIOUS JOE

      CAN BLOGGING START A CAREER?

Blogging can be a useful tool for either personal or professional purposes. Take my website, for instance, which is a personal blog; compare it with BBC Business Editor Robert Peston's, which is a professional outlook on world business developments. Whichever you choose, all of your information is accessible in the public domain.

So, this leads on to the question as to whether blogging can start a career. In the case of Heather Armstrong it was, at first, quite the opposite. She lost her job as a web designer for writing offensive comments about her boss on her blog. In fact, she inspired a new term for being fired because of a bad digital footprint: being 'Dooced', or as Urban Dictionary describes it, 'to lose one's job because of one's website.' However, she has learnt her lesson and now runs Dooce as a successful business in partnership with her husband, with a daily average of 55,000 readers.

Another good example is Salam Pax, an Iraqi who blogged about the Iraq war from its outbreak in 2003. He provided regular news and updates from a civilian's perspective, subsequently gaining media attention. Some of the posts he wrote were extremely descriptive, painting a very clear image. The blog was later published as a book in association with The Guardian. This is another strong example of how the internet can quickly propel a career.

      "There were days when the Red Crescent was begging for volunteers to help in taking the bodies of dead people off the city street and bury them properly. The hospital grounds have been turned to burial grounds."
      Salam Pax

In conclusion, you could suggest that a blog can either start or ruin a career. It all depends on the content you write; as Heather Armstrong put it, "my advice to you is be ye not so stupid."



Posted on 10/28/2009 by JUDICIOUS JOE