Picture This

Transcript of podcast below.

This year is the 25th anniversary of a software product known and loved by photographers and designers the world over: Adobe Photoshop. Photoshop enables skilled practitioners to perform anything from simple colour corrections to radical alterations of content and composition. Photoshop and other photographic processing software completely changed the way photographers and designers work, to the point where Photoshop has become an almost essential component of most professional photographers’ toolkits.

But that same ability to digitally manipulate photographs brings with it a Pandora’s box of problems. Fashion models have their bodies digitally sculpted into more attractive forms, causing many to doubt the reality of a lot of fashion pictures. Whilst female fashion models and stars appear to be digitally manipulated far more often than we’d like, it’s not just fashion models who experience a little creative re-modelling. National Geographic was hoodwinked in 2010 with a faked picture of a dog.

Now it’s true that some digitally faked pictures can be easy to spot, but there are an awful lot of pictures out there in which digital manipulation may be almost impossible to detect, unless we were there when the picture was taken. It’s at this point I begin to wonder what all the fuss is about. After all, if someone takes a picture of a bunch of daffodils and turns them into a glowing yellow ball of sunshine, why not? If the picture appeals, where’s the harm?

To a point there probably isn’t any harm, but what about pictures used in news stories? Is there an absolute guarantee that they’re not digitally manipulated? There are rules that news agencies impose on photographers, but rules get broken. A particularly famous example was a shot of President Obama standing on a beach, head down, during the US Gulf oil spill. The dramatic picture of a troubled president, standing alone, grabbed a lot of attention. The problem was that he wasn’t alone: two companions were edited out of the shot to make it more dramatic and appealing to the audience.

The difficulty we have is that we live in an image- and media-hungry world, where web and social media sites post increasing numbers of pictures and may not have the time, or the inclination, to verify the authenticity of an image. In some cases an enhanced photograph might be no big deal, but in others it may completely change the emotional impact, or the sense, of a serious news story.

Whilst Photoshop itself is not the bad guy in this story, the temptation for photographers to tweak and change their photographs to make them more dramatic will always be there. All we can do as consumers is keep a sceptical eye on the big picture: in a digital age anything can be manipulated, including us, the audience.

A Voyager of Discovery

Around 38 years ago, in 1977, NASA launched a space probe called Voyager, before most students, and quite a few staff at UCS, were born. I’ll come back to the significance of that point later.

The original concept of Voyager was to take advantage of an alignment of the planets in the solar system, which takes place once every 175 years. Voyager and a sister vessel would fly a pre-planned path in order to survey the planets Jupiter, Saturn, Uranus, Neptune and Pluto. Voyager 2 was intended to survey most of the planets, whilst Voyager 1 was intended to survey one of Saturn’s moons, Titan.

Both spacecraft completed their original missions extremely successfully, much to the delight of NASA engineers and scientists. The Voyager probes sent back quite astonishing pictures of the planets in our solar system, at a level of detail that people had never been able to capture before. But the real surprise hasn’t been the pictures and unparalleled information that’s been gained; it’s how long both probes have lasted.

NASA engineers had projected an expected lifetime for both probes of around five years. They’d planned for the long term though, fitting plutonium batteries and as robust a computer control and management system as they could engineer. Even with the most robust engineering they could produce, no-one expected the probes to last as long as they did, and because they’ve lasted so long they’ve run into a few new problems that they’d never envisaged.

You see, 38 years ago programmers created the management system for Voyager using a programming language that is fairly arcane by modern standards. NASA is finding it quite challenging to find software engineers who want to maintain Voyager with such an old language, and on such ancient (by modern standards) computer hardware. The problem arose when the last of the original programming team retired. I guess he deserved to, at the grand age of 80.

Voyager might be a unique example of the kind of problems older technology can throw up, but parallels exist in modern businesses, which routinely spend millions of pounds to keep their systems up to date, lest they end up as old as Voyager’s.

You see, whilst technology marches on, there’s a hidden cost: you’ve got to keep it up to date, or you’ll end up losing control over the data, information or system simply because of its age.

A Piece of Cake

Transcript of podcast below.

What’s the connection between cakes and supercomputers? To answer the question we need to go back to 1947.

In 1947 J Lyons and Co were a huge British catering company, feeding the nation with cakes, bread, pies, coffee, ice cream and tea. They had a problem though: their accountants estimated that they only made a quarter of a penny on every meal, or cake, sold. How could they make their business more profitable?

In search of an answer, Lyons dispatched two senior managers to America to investigate a new phenomenon: the computer. They came back from their trip convinced that Lyons needed a computer of their own. Back then all computers were built by hand, as one-off items.

To obtain a computer of their own Lyons approached Cambridge University and agreed to fund the completion of a computer project called “EDSAC”, for the princely sum of £3,000. If the project was a success, Lyons planned to buy and commission a similar model for their business. After some trials at Cambridge, Lyons was convinced they were onto a winner and commissioned an EDSAC clone, which they dubbed “Leo”.

They set Leo to work in their head office. Its principal job was to collate orders from their customers, and work out the manufacturing schedule. The boost to efficiency was so great that Leo was soon put to work calculating payroll and eventually inventory management.

Lyons started renting computer time to other companies, including the Ford Motor Company, and they ended up spinning off a new computer business. Sadly for Lyons, making cakes and computers didn’t mix as well as they could have done, and something had to give. The fledgling computer business was eventually sold off, presumably with a free bag of Lyons cakes to tempt bidders.

The story didn’t end there though: the Lyons computer business was bought up by the English Electric Company, which in turn became part of the ICL conglomerate, which was eventually acquired by the Japanese computer giant Fujitsu.

If you stood one of Fujitsu’s supercomputers alongside Leo, it would be like comparing Stone Age man to modern man. Where once Leo calculated payroll and cake orders, Fujitsu’s supercomputers churn out gene sequences and weather predictions, and render movies. In fact the number of tasks that computers aren’t involved in is becoming vanishingly small.

In the end what drove the evolution of computers was simple: business competition. Where once upon a time Lyons owned a computer and rented it to others, now almost every business owns one or more computers, ranging from the PC all the way up to high-performance computer arrays.

All of these computers are bent to one aim: efficiency and, ultimately, profit. It’s worth bearing in mind that a significant slice of UK computing history started with just a piece of cake.

Reaching for the Stars

Transcript of podcast below.

Once upon a time, in a technical universe far, far away, reviews of establishments like restaurants were published in newspapers and magazines. Restaurants had to be pretty successful to even get reviewed, and even if they were a great success they wouldn’t be reviewed that often. However, the days of the restaurant critic were numbered: the Internet brought with it the capacity to instantly review anything at all.

When that feature arrived, crowds of people piled in with their views of their local establishments. Web sites like Trip Advisor and Zagat soon became the place to find out what others thought of the place you were planning to eat at.

Now everything could have been rosy, and at the beginning it probably was, but pretty soon the roses started wilting. What happened? The first thing that happened was negative reviews. Negative reviews can have a profound impact on small businesses, with some claims being made that a single negative review could shut a small business down.

Small businesses felt that they were being blackmailed by customers, with the threat of a negative review. Some businesses even went as far as suing negative reviewers for libel. To say that a gulf existed between consumer and supplier was an understatement. Then the suppliers began to bend the rules, and make them work in their favour.

Companies would ask friends, and soon professional copywriters, to post positive reviews of their establishment or service. This fake-review process has got so bad that various government-appointed bodies are looking into the matter.

As the old saying goes, two wrongs don’t make a right, but the use of reviews as a commercial tool is a long-established practice. What the Internet has done is simply magnify the problem, and along the way it has created an “astroturfing” industry that churns out fake reviews for a price.

The practice of creating and posting fake reviews, by any business, is illegal. However, whilst it might be illegal, there’s not much chance of a company ending up in court. The bodies responsible for enforcement are, depending on the case, either the Competition and Markets Authority or the Chartered Trading Standards Institute. These authorities are spread pretty thin, and the Internet is a vast, sprawling metropolis of web sites, which cannot be effectively policed by any human agency without significant computer assistance.

If a company is taken to court, presumably because it was reported to the relevant body, the penalties are clear: two years in prison and an unlimited fine for breaches.

Still, even with the threat of unlimited fines, companies routinely break the law, thinking themselves immune from any consequences. All we can do as consumers is exercise our judgement when looking at reviews, and report anything that looks suspicious to the relevant government body.

The Need for Speed

Transcript of podcast below.

The cloud is an all-encompassing term used to describe a range of products and services that are, more or less, universally accessible. The question I’d like to pose today is this: what powers the cloud?

One answer might be the vast array of computers housed in expensive data centres across the planet. Another might lie somewhere within the very large companies that have a vested interest in getting as big a share of your custom as possible, via cloud services.

A much simpler answer is that speed powers the cloud. The cloud wouldn’t exist without high-speed Internet connections. At the turn of the 21st century Internet connection speeds were nothing to write home about, and slow connections meant that Internet services were more limited in scope and restricted in range. After all, what’s the point of having all your pictures stored in the cloud if you can never access them because your Internet connection is too slow?

These days higher connection speeds mean that we can access a much larger set of goods and services than we ever could before. Live streaming of music and video is a commonplace occurrence, although that does depend on the speed of your Internet connection and the type of media you’re listening to, or watching.

But not everything is rosy in the garden. Mobile devices still lack universally fast Internet connectivity, and that has a knock-on effect on mobile access to cloud services, which can lack zip and appeal if your mobile device suddenly decides it doesn’t have a turbo-charged Internet connection to hand.

The same limitations apply to free Wi-Fi and Internet services. If lots of people access the same free service the immediate consequence is that the speed of that service declines, sometimes quite sharply. As consumers of free Wi-Fi and other access services, there’s not a lot we can do to speed things up, except hurry up and wait.

In time mobile communications will become a lot faster, but that has to be balanced against the ever-increasing demands for speed that new cloud services might impose. I’m not sure that we’ll see fast mobile Internet connections everywhere in the UK, but I’m sure that mobile service providers are working on it.

At some point in the future the need for speed may cease to be an issue but something tells me that there will always be one more service that requires us to push the Internet accelerator a little bit harder than before.