At 2:45 p.m. on May 6, 2010, Wall Street essentially had a heart attack. In just minutes, the stock market plunged 1,000 points, for reasons traders, analysts, and the business media could not explain. The "flash crash" wiped out $1.1 trillion of investor dollars, and even though most of that was quickly regained, it left the market badly shaken.
What happened? It appears that a single keystroke error was to blame. The letter "B" was entered in a sell order instead of the letter "M"; "billion" was input where "million" should have been, and the mistake triggered a ripple effect through the automated financial markets. Costly errors in the events business might not have as many zeros as that epic fail, but when it's your event or your exhibitor that has to deal with a problem caused by a keystroke mistake, it can seem just as bad.
Today a surprising number of venue managers and event organizers still work with separate CRM, operations, and financial systems that either require them to enter the same data manually multiple times or rely on one-way information flows that can drift out of sync from system to system. The result is costly, and often embarrassing, errors that stem from bad or out-of-step event data.
But how acute is this problem? How exactly does it bleed energy and money from your organization? There are several ways in which poor or manual information flow can hinder your events business.
The first issue is the cost of a mistake creeping into your information systems, customer orders, service or operations orders, or billing. You are particularly vulnerable if any data is manually "double-entered" from one system into another.
The (Flawed) Human Element
If you take the common benchmark of a 1% error rate for manual data entry, what does that mean in your business? How many records or fields do your employees enter into the system each day? If 1 of every 100 of those entries is likely erroneous, you get a sense of your exposure. Now multiply that exposure for every time information is moved from one system to another. It is not uncommon for event administrators to treat systemic data problems that constantly need to be monitored and cleaned up as just "part of the job."
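The arithmetic above can be sketched in a few lines. This is a back-of-the-envelope estimate only; the daily entry volume and the number of disconnected systems are hypothetical example figures, and the 1% rate is the benchmark cited above.

```python
def expected_errors(entries_per_day, error_rate=0.01, num_systems=1):
    """Rough expected number of erroneous records per day when the same
    data is rekeyed into num_systems separate, disconnected systems."""
    # Each rekeying into another system is another chance to introduce an error.
    return entries_per_day * error_rate * num_systems

# Hypothetical example: 500 fields entered daily, rekeyed into 3 systems.
print(expected_errors(500, num_systems=3))  # 15.0 expected errors per day
```

Even at modest volumes, rekeying across several systems multiplies a "small" 1% rate into a steady stream of daily errors.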
Human ability to catch or avoid errors within data sets is inherently limited. In a controlled study at UNLV in 2009, 215 students were given 30 data sheets containing six types of data to process. With only visual confirmation of data correctness, the students made an average of 10.23 errors. However, when double entries were automatically checked for matches by an automated system, that average dropped to 0.38 mistakes. And when information is pulled from a single unified database, requiring no double entry or data migration at all, the opportunity for this kind of human error all but disappears.
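The automated check described in the study can be sketched simply: the same record is keyed twice, and any field where the two entries disagree is flagged for review rather than left to a visual proofread. This is a minimal illustration, and the record fields shown are hypothetical.

```python
def mismatched_fields(entry_a, entry_b):
    """Return the fields where two independent entries of the same
    record disagree -- each mismatch a likely keystroke error."""
    return [field for field in entry_a if entry_a[field] != entry_b[field]]

# Hypothetical double entry of the same exhibitor record.
first_pass  = {"exhibitor": "Acme Corp", "booth": "1204", "fee": "5000000"}
second_pass = {"exhibitor": "Acme Corp", "booth": "1204", "fee": "5000000000"}

print(mismatched_fields(first_pass, second_pass))  # ['fee']
```

Note that the flagged field is exactly the billion-for-million class of mistake from the flash crash: the check cannot say which entry is right, only that a human needs to look.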
The $1-$10-$100 Rule
A common business concept is the 1-10-100 rule, a rule-of-thumb model that illustrates the hard costs an organization incurs chasing mistakes: the later an error is noticed and corrected, the more it costs.
It costs:
- $1 to verify the accuracy of data at the point of entry,
- $10 to correct or clean up data in batch form, and
- $100 (or more) per record if nothing is done, including the costs associated with low customer retention and inefficiencies (source: totalqualitymanagement.wordpress.com).
In other words, a shared database eliminates the time and cost of rekeying and verifying information across separate, disconnected systems. A single database for CRM, exhibition management, and financials also prevents the costly and embarrassing mistakes those disconnected systems create.
In the second half of this blog series, I discuss some of the costs associated with quality assurance issues and efficiency repercussions, and provide some solutions for these problems.