All posts by John Herd


A Sting in the Tail

Transcript of podcast below.

Filmmakers are gradually shifting to digital video because it’s flexible and cheap. I should rephrase that last comment – it’s certainly flexible, but is it cheap?

I was surprised to discover that making a digital film may cost about the same as making a conventional one. There is a healthy debate about which format costs more, and many creative professionals opt for digital because of its undoubted flexibility; depending on the director, the cost of the medium may be a secondary consideration.

But what happens to all this content once it’s been produced and the audience has watched it, enjoyed the popcorn and left the cinema? Well, that’s where there’s a sting in the tail for the film industry.

You see, film and television companies know that a film, once made, may have a very long life after its initial showing. All the company has to do is store the original master copy somewhere safe, bringing it out from time to time for a re-release or repeat showing.

And here’s where this story acquires a digital sting – it’s all in the storage. It’s been reported that digital content costs 12 times more to store than conventional film. It gets worse if the production is digital end-to-end, where storage could be up to 200 times more expensive! So why the large difference in cost?

To store old-fashioned celluloid film, all that’s needed is a nice cool storage space, some shelving and a label on the tin for identification. Digital film storage involves backup tape, hard drives or a combination of the two. It sounds pretty simple, doesn’t it? But there’s a problem: over time, digital content needs migration to successor generations of storage media, because digital media degrades at an alarming rate compared to celluloid film (or, for that matter, paper). On top of that, there is the danger of digital media becoming unreadable should data storage formats change, which they do more frequently than you’d think.

It’s an ironic thing, isn’t it? The cost of a digital production might appear cheap up front but can, in the case of feature films and big-budget productions, become quite significant over the years. For example, the cost of storing a digital feature film for a decade might be in the region of $2 million, whilst conventional film would cost about $10,000 over the same period. The price difference is enough to make another feature film, and store it, if shot on conventional film.
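
The arithmetic behind those figures is simple enough. Here’s a minimal back-of-the-envelope sketch in Python, purely illustrative and using only the dollar amounts quoted above (reported estimates, not data of my own):

```python
# Decade storage costs for one feature film, as reported above.
digital_cost = 2_000_000  # digital master, stored for ten years ($)
film_cost = 10_000        # celluloid master, stored for ten years ($)

ratio = digital_cost / film_cost
print(f"digital storage is ~{ratio:.0f}x the cost of film over a decade")
# -> digital storage is ~200x the cost of film over a decade
```

That ~200x ratio is the same worst-case multiplier quoted above for an end-to-end digital production.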

If the cost of storing digital media doesn’t come down sharply – and there aren’t many signs of that happening at the moment for film – it’s possible that very large amounts of the creative output of the 21st century will simply disappear. We may end up with more extant films from the 1950s than from 2015, and with them the cultural footprint of a generation could disappear like smoke in a breeze.

 

A Flash in the Pan

Transcript of podcast below.

In the last week of November 2015, Adobe pronounced the last rites over Adobe Flash, and apparently no-one noticed. In its day Flash was a revolutionary product, but the Flash revolution got too big, too fast for anyone to control it.

Go back 10 years and you’d find that Flash powered YouTube, the BBC’s then-revolutionary iPlayer and a multitude of media-heavy web sites. At the time Adobe made the bold claim, disputed by competitors, that Flash was responsible for 75% of all video content on the web. You don’t get much bigger than that.

Of course, you can have too much of a good thing, and that was certainly the case with Adobe’s Flash. It felt like everyone was using Flash to do the daftest of things and, at times, entirely inappropriate stuff that shouldn’t have made it onto any web site. Then again, that’s the nature of the Internet: if it can be done, it will be done, done to death, and sometimes beyond that point too.

Such was the success of Adobe Flash that pretty much every computer system had to have it installed; otherwise large chunks of web content simply weren’t visible. The problem was that Flash was present on so many devices, in so many different variations and versions that hackers began to pay a lot of attention to its vulnerabilities. Their attention paid off, as they found a rich vein of exploitable computers out there that were running vulnerable versions of Flash.

Before anyone could say “quick as a flash”, large numbers of computers were infected with viruses that exploited Flash vulnerabilities, and the number of such viruses began to snowball. Internet users were left confused: they needed this software to watch and interact with media-centric sites, but were becoming increasingly nervous about the security problems associated with running Flash.

Something had to give, or rather something had to be taken away. Steve Jobs, then Chief Executive of Apple, published an open letter entitled “Thoughts on Flash”, in which he bluntly stated that Flash was never going to appear on Apple’s mobile devices. Given Apple’s share of the device market, it was the beginning of the end. Within a few years a significant number of manufacturers had followed suit, and the writing was on the wall – in animated letters – for Flash.

It goes to show that being successful – Internet successful – can come at a heavy price. In its day Adobe Flash was the King, Queen and Ace for any media web site; today it’s HTML5 and a basket of other products. Perhaps the moral of the tale is that no piece of software should attempt to be all things to all people, and any effort to do so will ultimately cause the product to fail.

A Point of Review

Transcript of podcast below.

I get the feeling that YouTube probably contains a review of almost any product available on the planet. I could, for example, watch reviews featuring someone unboxing a new toaster, showing off a new pair of jeans or even eating army rations of the world. In each case the reviewer is taking a finished product and passing their judgement about it to their audience. I’ve got the feeling that I could probably spend longer watching reviews of a product than I would actually spend using the object of the review itself, such is the selection of alternative viewpoints.

But what if the product wasn’t finished? Would you trust a review of a car that had no engine in it? Would you trust the review of a pair of jeans if the pair on display had only one leg? I think we’d exercise a healthy dose of scepticism. We’d want to see the finished product.

Being sceptical about an unfinished physical object is fair enough, but what about the digital world? How do we know that anything’s finished, particularly when software companies continue to push out updates for established, and new, software at an ever-increasing rate?

Take computer games as a good example. Game developers now release “early access” games, inviting enthusiastic early adopters to pay for an unfinished game. Some people don’t think reviewing unfinished products is legitimate, but that neatly illustrates the divide between physical and digital objects.

Most people would think it quite odd if someone were to publish a review of a toaster that couldn’t actually toast anything because it wasn’t a complete product. On the other hand, if we jump into the digital domain, it’s pretty evident that reviews of unfinished digital products are commonplace. Human curiosity is such that people want to know as much as possible about digital products, even if they’re not quite finished yet.

Of course, multiple changing reviews of digital products can lead to search engines serving up an out-of-date review, written for an earlier version of the digital product that we’re interested in. This adds to the challenge of working out whether a review is worth considering or not.

The changing nature of reviews reflects our changing expectations of digital products. In the physical world I don’t expect my jeans to suddenly sprout a pair of pockets where none existed before, but that’s exactly what I expect of my digital products. I expect new features, fixes to known bugs (or problems) and, with them, a fresh review of the product, produced by the endless army of amateur and professional reviewers on the Internet.

Welcome to the digital age, where it’s perfectly reasonable to review a car without an engine or a pair of trousers with only one leg.

On Loan

Transcript of podcast below.

If I’d wanted to start a business 30 years ago, I’d probably have gone to the bank, obtained a loan, put up some kind of guarantee against it, and paid off that loan over a period of years.

Of course, this being the 21st century, we don’t have to look to the bank to start a business, nor even to our close friends, although both might still lend a helping hand. The power of the Internet and crowdfunding services enables tens, hundreds or even thousands of Internet “friends” to contribute towards an overall business funding goal.

To pick an example of the success of crowdfunding services, let’s take a look at Kickstarter, one of the more visible services. Since its launch in 2009 it has raised more than $2 billion, funded nearly 97,000 projects and attracted nearly 10 million individual backers, who have made 27 million pledges in total. Now that’s pretty big, but what about the old-fashioned banks?

Well, given that Kickstarter is a US service, let’s look at a US bank like Wells Fargo, which loaned $1.9 billion to small businesses in 2015 alone, with the top 100 US banks loaning $15 billion in 2015. Stock market investment in companies seeking to list in 2014 totalled $83.9 billion. Crowdfunding may be the darling of the Internet, but old-fashioned ways of raising money for businesses aren’t dead yet. I guess a good question to ask is this: why not?

I guess this is where we get to the truth of it. It doesn’t matter where you attempt to raise funding, prospective investors will always want to know the same things: what’s the product, what’s the plan, and just who is going to buy this thing anyway? The more convincing, and realistic, the business plan and product are, the better the chances of raising money. If you look at crowdfunding, banks and stock markets, each has a different appetite for risk, and the less risky a proposition looks, the more investment cash it’s likely to attract. Perhaps it’s time to take another look at crowdfunding – what kind of products are raising the cash?

I’m not sure about you, but could you see a major bank funding a project called “Exploding Kittens”? I’m not sure that I can, but I do know that crowdfunding helped raise $8 million for the project. In fact, if you take a look at the list of crowdfunded projects, you’ll notice a distinct trend towards video games and computer gadgets.

In other words, crowdfunding fills a niche for projects and products that might carry too much risk for a traditional bank, or the stock markets, to take a chance on, for now. Perhaps crowdfunding will become a mainstay of new business funding in the future – but that all depends on individual investors’ appetite for risk and, for that matter, their taste in exploding kittens.

It’s Evolution Baby!

Transcript of podcast below.

I’d like to throw some large numbers at you. How many transistors are there in Intel’s latest 18-core Xeon chip? The answer is 5.5 billion. How many neurons are there in the human brain? A staggering 86 billion. It’s worth noting that the part of our brain that does most of the thinking contains about 16 billion neurons; the rest handle specialist tasks such as hearing, vision and speech.

Now, comparing an Intel Xeon chip to the human brain is a little strange, given that there’s no direct equivalence between a transistor and a neuron. On the other hand, 5.5 billion transistors on a tiny CPU is quite a feat of engineering, although it’s not a lot next to the 86 billion neurons in the human brain.

Now consider this. It took Intel 44 years of design evolution to produce a 5.5 billion transistor chip, starting back in 1971 with the 2,300-transistor Intel 4004. It took humans about 30 million years to evolve an 86 billion neuron brain. Computer chip evolution is proceeding at a pace that living organisms should be quite jealous of.
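
Those two milestones imply a doubling rate. Here’s a minimal sketch in Python, purely illustrative and using only the figures above:

```python
import math

# Implied doubling time for Intel's transistor counts:
# 2,300 transistors (4004, 1971) to 5.5 billion (18-core Xeon, 2015).
doublings = math.log2(5.5e9 / 2300)  # ~21.2 doublings
years = 2015 - 1971                  # 44 years of design evolution
print(f"one doubling roughly every {years / doublings:.1f} years")
# -> one doubling roughly every 2.1 years
```

That works out at roughly one doubling every two years, which, as it happens, is exactly the cadence of Moore’s law, mentioned below.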

Computer chips will undoubtedly become much more sophisticated in the coming decades, but what of the human brain? Well, there’s not much evidence that we’re going to get a lot more intelligent unless some dramatic evolutionary change occurs. In fact, there’s some evidence that the human brain is actually shrinking.

If we accept that human beings might not get brainier, what about the computer chip? Intel’s co-founder Gordon Moore observed in a 1965 paper that the number of transistors on a chip was doubling at a regular rate, a prediction he later settled at a doubling every two years. Whilst Moore’s law isn’t an accurate prediction of transistor count these days, if it still held, someone would design a computer chip with more transistors than the human brain has neurons by 2023.
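
That 2023 figure is easy to check against the numbers already quoted; a minimal sketch, again in Python:

```python
# Project the 18-core Xeon's 5.5 billion transistors (2015) forward
# at a Moore's-law pace (doubling every two years) until the count
# passes the human brain's 86 billion neurons.
transistors = 5.5e9
neurons = 86e9
year = 2015

while transistors < neurons:
    transistors *= 2
    year += 2

print(f"{year}: ~{transistors / 1e9:.0f} billion transistors")
# -> 2023: ~88 billion transistors
```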

Raw transistor counts don’t mean much without software that does something, and whilst artificial intelligence software can do some quite clever things, no-one’s close to creating a virtual human brain capable of the vast number of simultaneous tasks that the real thing has to cope with.

Whilst I don’t think human beings will be using computer chips to augment their intelligence in the very near future, I suspect it’s inevitable given the pace of change in computer sophistication and complexity.

When that day arrives, posthuman evolution will begin and, given the rate at which computer components evolve, it may be exponentially faster than anything we’ve seen before.