Many people lost a lot of money investing in non-existent data compression software because well-established principles of information theory were ignored. This article is based on a presentation to the 2010 NZ Skeptics conference.

In the late 1990s, Nelson man Philip Whitley claimed to have invented a new data compression technology worth billions of dollars. Over the next decade money was raised on a number of occasions to develop this technology, culminating in a company called NearZero Inc raising $5.3 million from shareholders. According to a well-established body of theory, Whitley’s claims were obviously false. Unsurprisingly, within a few months of NearZero’s formation, it was in liquidation, with its funds gone.

I thought the saga of NearZero could be of interest to skeptics as it involves claims that were clearly false according to well-established theory, and those claims cost investors a lot of money.

But first, a quick introduction to how data is stored by computers, and how that data can be compressed. Computers store data digitally, using the digits 0 and 1 in a binary code. A piece of storage capable of storing a 0 or a 1 is known as a bit (short for binary digit). With 1 bit we can store two values: 0 and 1. While this might be enough to store a simple data value (such as whether someone is male or female), for most pieces of data we need to store a larger range of values. With each bit we add, the number of possible values doubles; by the time we get to 8 bits we have 256 different values. The byte (a group of 8 bits) has proved to be a very useful unit of storage; storage sizes are usually quoted in bytes.

Character data is usually stored 1 byte per character (in European languages). Lower case ‘a’ is represented as 01100001, for example. A picture is a grid of dots. Each dot is called a pixel, and usually 4 bytes are used to encode the colour of a pixel. Standards are needed so that everyone interprets bit patterns in the same way.
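
To make this concrete, here is a short Python sketch (my own, purely illustrative) that prints how the number of possible values doubles with each extra bit, and shows the byte used for a lower case ‘a’:

    for bits in range(1, 9):
        print(bits, "bit(s) give", 2 ** bits, "possible values")

    # ord() returns the character's numeric code; format it as 8 binary digits
    print("lower case 'a' is stored as", format(ord("a"), "08b"))  # 01100001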

Data representation methods are often chosen based on how easy it is to process the data. Often, the same data can be stored more compactly at the cost of making it harder to process. The process of translating a piece of data into a more compact form (and back again) is known as data compression. Compressing data allows us to put more data onto a data storage device, and to send it more quickly across a communications link. The size ratio between the compressed version and the uncompressed version is known as the compression ratio.

In ‘lossless’ compression, the decompressed data is always identical (bit for bit) to the original data we started with. A compression method designed to work with any type of data must be lossless.

In ‘lossy’ compression, we are willing to accept small differences between the original data and the decompressed data. In some situations we do not want to risk data being changed by compression, and lossless methods must be used. With images and sound, however, small changes that are difficult for humans to detect are tolerable if they lead to big space savings. The JPG image format and the MP3 audio format have lossy compression methods built into them. Users can choose the tradeoff between quality and space.
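
The difference is easy to see with a lossless compressor. The sketch below (mine, using Python’s standard zlib module rather than any particular product) compresses some repetitive text, decompresses it, checks that the result is bit-for-bit identical, and prints the compression ratio:

    import zlib

    original = b"the quick brown fox jumps over the lazy dog " * 100
    compressed = zlib.compress(original)
    restored = zlib.decompress(compressed)

    assert restored == original   # lossless: identical to the original, bit for bit
    print(len(original), "bytes compressed to", len(compressed), "bytes")
    print("compression ratio:", round(len(compressed) / len(original), 3))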

A question of pattern

For it to be possible to compress data, there must be some pattern to the data for the compression method to exploit. Letter frequencies in English text are well known, and could be the basis for a text compression method. We can do better if we take context into account. The most frequent letter is ‘e’ (12.7 percent), but if we know the next letter is the first in a word then ‘t’ is the most likely (16.7 percent). If we know the previous letter was ‘q’ then the next will almost certainly be ‘u’. A compression method that takes context into account will do better than one that doesn’t, as the context-based one will be a better predictor of the next symbol.
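
Anyone can verify these patterns. The short sketch below tallies overall letter frequencies and then the frequencies of the letters that follow ‘q’; the file name sample.txt is just a placeholder for any reasonably long piece of English text:

    from collections import Counter

    text = open("sample.txt", encoding="utf-8").read().lower()
    letters = [c for c in text if c.isalpha()]

    overall = Counter(letters)
    after_q = Counter(b for a, b in zip(letters, letters[1:]) if a == "q")

    print("most common letters overall:", overall.most_common(3))
    print("most common letters after 'q':", after_q.most_common(3))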

Likewise images are not random collections of coloured dots (pixels). Rather, pictures typically include large areas that have much the same colour. Sequences of frames in a movie often differ little from each other, and this can be exploited by compression methods.

The effectiveness of a compression method depends on how predictable (or random) the data is, and how good the compression method is at exploiting whatever predictability exists. If data are random, then no compression is possible; in that case a compression method can actually produce a compressed file larger than the original, because compression methods have some overhead. A compressed file is much more random than the uncompressed version, because the compression method has removed the patterns that were present in the original.
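
These points are easy to check with a standard compressor: random bytes typically come out slightly larger than they went in, while the same amount of highly repetitive data shrinks dramatically. A quick Python check (my own, purely illustrative):

    import os
    import zlib

    random_data = os.urandom(100_000)   # random bytes: no pattern to exploit
    repetitive = b"abc" * 33_334        # roughly the same amount of patterned data

    print("random:    ", len(random_data), "->", len(zlib.compress(random_data)))
    print("repetitive:", len(repetitive), "->", len(zlib.compress(repetitive)))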

In many branches of computer science it is important to establish the best possible way in which something could be done, to serve as a benchmark for current methods. In information theory, Shannon’s entropy is a measure of the underlying information content of a piece of data. A 1000-character extract from a book has more information content than 1000 letter ‘x’ characters, even though both might be represented using 1000 characters. To quote Wikipedia: “Shannon’s entropy represents an absolute limit on the best possible lossless compression of any communication”. Modern compression algorithms are so good that “the performance of existing data compression algorithms is often used as a rough estimate of the entropy of a block of data”. In other words, it is not possible to achieve large improvements over current compression techniques.
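
To make the idea of entropy slightly more concrete, here is a small sketch (again my own, not anything from the case) that computes the single-character Shannon entropy, in bits per character, of a 1000-character stretch of English-like text and of 1000 ‘x’ characters. Real compressors exploit much longer contexts, but even this crude measure shows the difference in information content:

    import math
    from collections import Counter

    def entropy_bits_per_char(s):
        n = len(s)
        counts = Counter(s)
        return sum((c / n) * math.log2(n / c) for c in counts.values())

    english_like = ("a 1000-character extract from a book has more information " * 20)[:1000]
    all_x = "x" * 1000

    print("English-like text:", round(entropy_bits_per_char(english_like), 2), "bits per character")
    print("1000 'x' characters:", round(entropy_bits_per_char(all_x), 2), "bits per character")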

The claims

It is time to have a look at Philip Whitley’s claims. He claimed that he could compress (losslessly) any file to under seven percent of its original size, but this is not credible. Compression potential varies widely depending on patterns in the original file. Many files are already compressed, so have little potential for further compression. Even for uncompressed files, seven percent is achievable only in exceptional cases (the entropy of English text means the best achievable for ordinary text is around 15 percent).

If it were possible to compress any file to less than seven percent of its original size, then it would be possible to compress any file down to 1 bit. The first compression takes you down to under seven percent of the original file. Given that Whitley claimed his technique worked on any file, we could then compress the compressed file, reducing it to less than 0.5 percent of the original size, and so on until only a single bit remained.
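
The arithmetic is easy to check. Starting from a one-gigabyte file (roughly eight billion bits) and applying the claimed ‘under seven percent’ result over and over:

    size_bits = 8 * 10 ** 9   # roughly 1 GB expressed in bits
    passes = 0
    while size_bits > 1:
        size_bits *= 0.07     # the claimed worst-case result of one compression pass
        passes += 1

    print("passes needed to get below 1 bit:", passes)   # 9

After just nine passes the whole gigabyte has supposedly shrunk to a single bit, which is plainly absurd.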

Initial tests of Whitley’s technology were done on one computer. This made it easy to cheat. The ‘compression’ program can easily save a copy of the original file somewhere on disk as well as producing the ‘compressed’ version. Then, when the compressed version is ‘expanded’, the hidden copy can be restored. Whitley remained in control of the equipment, ostensibly to prevent anybody from stealing his software.
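
Nobody outside the company knows exactly what Whitley’s program did, but a hidden-copy cheat takes only a few lines to write. The sketch below is hypothetical and deliberately simplified; the stash location and file names are my own inventions:

    import pathlib
    import shutil

    STASH = pathlib.Path(".hidden_stash")   # hypothetical hiding place on the same disk

    def fake_compress(path):
        STASH.mkdir(exist_ok=True)
        shutil.copy(path, STASH / pathlib.Path(path).name)   # quietly keep the original
        out = path + ".cmp"
        pathlib.Path(out).write_bytes(b"tiny")               # impressively small 'compressed' file
        return out

    def fake_decompress(compressed_path):
        name = pathlib.Path(compressed_path).name[:-4]       # strip the '.cmp' suffix
        restored = "restored_" + name
        shutil.copy(STASH / name, restored)                  # just bring back the hidden copy
        return restored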

Critical assessment

Philip Whitley’s company Astute Software paid Tim Bell (an associate professor of computer science at the University of Canterbury) for an opinion on the technology. Tim Bell has an international reputation in the field of data compression; Microsoft has used him as an expert witness, and he has co-authored two well-known compression textbooks. An irony of the NearZero case is that New Zealand has more expertise in this field than you might expect for a small country (the co-authors of the two textbooks are New Zealand-born or live in New Zealand).

Tim Bell’s views were blunt: “The claims they were making at the time defied what is mathematically possible, and were very similar to claims made by other companies around the world that had defrauded investors.” One of his criticisms was that the tests were not two-computer tests. In such a test the compression is performed on one computer and the compressed file is transferred to a second computer, where it is decompressed. A two-computer test prevents the hidden-file form of cheating. It is reasonably easy to monitor the network cable between two computers, to check that the original file is not sent in addition to the compressed file (though the tester must be alert for other possible communication paths, such as wireless networks).

A two-computer test was subsequently conducted, and described in a 14-page report by Titus Kahu of Logical Networks. At first glance the report looks impressive, but on closer reading flaws quickly emerge. The two computers used were Whitley’s. The major flaw was that Kahu was limited to testing a set of 24 files selected by Whitley. The obvious form of cheating this allows is that the set of files can be placed on the second computer before the tests. Then all that the first computer needs to do is to include in the ‘compressed’ data details of which file is required (a number between 1 and 24 would suffice). The receiving computer can then locate the required file in its hiding place.

Titus Kahu did check the receiving computer to see if files with the names of those used in the test were present, but you would expect that someone setting out to deceive would at the very least rename the files.
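
Again hypothetically, the two-computer version of the trick needs almost no code once the test files are hidden on the receiving machine, because the ‘compressed file’ only has to say which of the 24 files is wanted. The file names below are purely illustrative:

    # on both machines: the agreed list of test files, already stored on the receiver
    PRELOADED = ["Calgary.tar", "Canterbury.tar", "report.txt"]   # ... up to 24 entries

    def sender(file_index):
        return bytes([file_index])        # the entire 'compressed file': one byte

    def receiver(message):
        return PRELOADED[message[0]]      # just look up the preloaded local copy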

The report makes for interesting reading. The files were of a number of types, including text files, pictures in JPG and GIF formats, MP3 audio files, and tar files. A tar file is a way of collecting a number of files together into a single file (zip files in Windows serve the same purpose).

One would expect text files to compress well, but JPG, GIF and MP3 files to compress poorly (they are all compressed formats). How well a tar file will compress depends on the files that it contains.

A simple comparison

To get some data to compare with the results in the report, I ran some tests using gzip (a widely used lossless compression method) on some text, tar and JPG files. I managed to locate two of the tar files used in the Titus Kahu tests: Calgary.tar and Canterbury.tar. Gzip achieved savings of 67.24 percent and 73.80 percent (so Calgary.tar was compressed to about one third of its original size, and Canterbury.tar to about one quarter). I also located three text files that were later versions of text files used by Kahu: on these gzip achieved savings of 63.08 percent, 62.05 percent, and 70.77 percent. I also compressed a JPG file using gzip, and achieved a saving of 2.34 percent.
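
For anyone wanting to reproduce this kind of figure, the sketch below shows how the percentage savings can be calculated in Python using its built-in gzip module; the file name is a placeholder for whatever file you test, and your exact numbers will differ from mine:

    import gzip

    path = "Calgary.tar"   # placeholder: any local test file
    with open(path, "rb") as f:
        data = f.read()

    compressed = gzip.compress(data)
    saving = 100 * (1 - len(compressed) / len(data))
    print(path, ":", len(data), "bytes ->", len(compressed), "bytes")
    print("saving: {:.2f} percent".format(saving))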

There are no great surprises in my results. There was quite a variation in the compression achieved, even amongst files of the same data type (the three text files for example). Compressing a JPG file gave little extra compression (not enough to make it worth further compression with gzip).

By comparison, savings in the report were 93.52 percent for four files and 93.53 percent for the other 20. I suspect that the difference in the fourth significant figure is due to rounding the file size to the nearest byte. These results are not remotely believable. The compression achieved is too good to be true even for data that compresses well (such as text), let alone for data formats that are already compressed. The incredible consistency of the compression achieved is also not credible.

Downfall

Having looked at some background, it is time to look at the chain of events that culminated in NearZero Inc’s rise and fall. Philip Whitley’s early forays into business were not promising. In 1995 he was adjudged bankrupt (discharged in 1998). In 1997 he became a shareholder in Nelic Computing Ltd, which went into liquidation in 1999, owing unsecured creditors $70,000.

In 1999 Philip Whitley formed a software company (Astute Software) with a number of Nelson investors (who put in $292,000). Astute worked on a number of projects, and developed the data compression technology. In early 2001 the ‘one-computer’ tests were done, and Tim Bell’s opinion was sought. In mid-2001 the Logical Networks ‘two-computer’ tests were done by Titus Kahu. In 2002, a Mr Cohen (an investor) asked for a (long-awaited) copy of the compression technology; Philip Whitley told him the only copies had been accidentally burnt while cleaning out his safe. Later in 2002 work stopped due to Whitley becoming ill.

In 2005 Whitley resumed work on the technology. Some of the original investors put in a further $125,000. On 10 July 2006, NearZero was incorporated in Nevada, with Philip Whitley as president, treasurer and sole director. Later in 2006 Titus Kahu became engineering director for Syntiro (a Philip Whitley company doing development work for NearZero) on the generous salary of $250,000 a year.

In February to April 2007 NearZero share purchase meetings were held in Auckland, Wellington and Christchurch. A total of 490 investors invested $5.3 million. The investment opportunity brochure forecast that the near-term NearZero market capitalisation would be US$482 billion to US$780 billion, and was expected to exceed one trillion US dollars. Note that the largest company in the world, Petrochina, is a US$405 billion company, and the largest US companies, including Exxon Mobil, Apple and Microsoft, are in the US$200 to 300 billion bracket.

Things quickly went wrong. In May 2007, the Securities Commission started investigating the legality of the NearZero share offer (there was no registered prospectus, for example). Also in May, PricewaterhouseCoopers (PWC) were appointed as interim liquidators for NearZero, and moved to sell houses and cars. In June, PWC said $218,000 went to Richmond City Football Club, $523,000 on vehicles, $852,000 on property, $683,000 to US-based company secretary Sherif Safwat, and $270,000 on household expenses. They found little evidence of money spent developing compression technology.

In June Whitley invited investors to contribute money to fund legal action to prevent liquidation. Also in June, PWC reported finding no evidence of any compression technology; Whitley claimed to have wiped it, but PWC found no evidence that wiping software had been used.

Then in July Whitley made some rather curious statements in an affidavit sworn in relation to the liquidation: “I will however say that it isn’t binary and therefore not subject to Shannon’s Law of algorithmic limitation.” If there was a real technology that was not based on binary it is hard to see it being of widespread use in computer and communication systems that store, transmit and process all data in binary. The affidavit continues: “Shannon’s Law is attached to this affidavit as Annexure “Y” and it can be seen that this is a 1948 paper”. Claude Shannon founded information theory, which is the basis of how digital computers represent data (according to one tribute, the digital revolution started with information theory). Shannon coined the term bit, and introduced the concept of information entropy referred to earlier. It is interesting that Shannon’s fundamental research results are dismissed as being in “a 1948 paper”.

He also stated: “In regard to the item 3/ I have never asserted that the technology is based on an algorithm”. In computer science, an algorithm is simply a description of how to do something in a series of steps. A common analogy is to say that a cooking recipe is an algorithm for preparing food. If Philip Whitley’s compression technology is not based on an algorithm then that implies it cannot be described as a sequence of steps, and therefore cannot actually be implemented!

In November, Associate Judge Christiansen ordered NearZero’s liquidation, and ruled that the compression technology had no value. Then in August 2008 Whitley faced the much more serious charge of making fraudulent claims about his technology.

In September 2008 all shareholders were given the option of keeping their shares or getting their money back. They proved to be remarkably loyal: $3.1m voted to stay in; $2.2m voted for reimbursement. I’m not sure whether there was any money to reimburse those who voted that way (probably not). In August 2009 Philip Whitley was convicted and fined for making allotments without having a registered prospectus.

The trial

In February 2010 the fraud trial began in Nelson. Whitley was charged with making a false statement as a promoter between July 2006 and May 2007. There were many sad stories in the Nelson Mail about wasted money and time (and resulting stress). Some of the information to emerge in the trial:

  • Philip Whitley hired a team of seven body guards headed by “Oz” (Oswald Van Leeuwen), who was on a salary of $300,000. This level of security was needed because of the (supposed) enormous value of the compression technology.
  • According to Sherif Safwat, Philip Whitley believed a Chechnyan hit team had arrived in New Zealand on a Russian fishing boat.
  • Philip Whitley: “The [security guards] said that the Russians were trying to penetrate and we ended up with security guards living in my house, camped on the floor … I couldn’t go out of the house without having security … it just built up inside me to the point where I just lost it from a point of paranoia.”

In his summing up on May 27, the defence lawyer said:

  • “Whitley had a distorted view of reality which led him to believe his data compression technology was real.”
  • “… [we are] not challenging the evidence of … Prof Bell that Whitley’s claimed invention was mathematically impossible.”

In July Philip Whitley was found guilty on two counts of fraud (though he maintained that he still had his inventions).

On August 10, 2010, he was sentenced to five years and three months in prison.

The NearZero mess should not have happened. New Zealand has more researchers in this field than you would expect for a country of this size. One of the most prominent, Tim Bell, clearly stated in 2001 that the claims were false. However, investors still committed (and lost) millions of dollars over a number of years. Compression claims are easily tested (much more easily than medical claims, for example). Whitley refused to allow his technology to be independently tested, using the excuse of protecting his intellectual property. Many people have been harmed, especially the investors. Moreover, this type of case is not good for the reputation of the IT industry, which struggles to attract investment.

I was asked at the conference how non-technical NearZero investors could have protected themselves. I had no answers at the time, but have given it some thought since. Some things they could have done:

  • Google the names of the company principals.
  • Check how the predicted market capitalisation compared with that of existing companies. Finding that even the lowest estimate would make NearZero the biggest company in the world should have led to some scepticism.
  • Google the terms ‘data compression’ and ‘scam’.

Much of the information in this article is based on the Nelson Mail’s extensive reporting of the issue, for which they are to be congratulated. Another good source of information was nearzero.bravehost.com, a website set up by and for NearZero’s shareholders in 2007 in response to the liquidation of NearZero. An article by Matt Philp on Philip Whitley and NearZero appeared in the October 2010 issue of North & South.
