Sunday, December 20, 2009

The Down-Side of Cloud Computing

Over the last 50 years, with exponentially increasing power and rapidly declining costs, information technology has proved to be an irresistible and inevitable force. Cloud computing represents the continuing rapid advance of information technology.

Cloud computing is a way of outsourcing information technology. It offers organizations substantial cost savings, frees up cash that is otherwise tied up in hardware, and lets them focus on their core competencies rather than managing systems and specialists they are ill-equipped to manage.

I am not going to get into the weeds here, but I do want you to know that there are a number of new, complex technologies operating in the Cloud. The independence of the different technologies contributes a measure of chaos to the Cloud that makes it almost impossible for non-techies to grasp.

I want to share with you several of my experiences dealing with problems and the technologies used in the Cloud.

Troubleshooting a Virtual Machine. Virtual machines operate like real machines -- except when the virtual machine crashes.

Here’s what I mean. One of the first steps in resolving a system problem in Windows is to look in the event logs and see what errors there are. If you are lucky, that will give you a clue as to where the problem lies and how to fix it.

But on a virtual machine, the logs can be misleading. For example, I was troubleshooting a problem on a virtual machine and saw in its event logs that there was a hardware disk error. Well, if that were a real error, I'd expect to see a similar error in the event log of the host or real machine. But the event log of the host machine was clean, meaning that the physical drive was OK. So, what does it mean to have a hardware error in a virtual hard disk? That is a head-scratcher. What do you do? Reboot the virtual machine and hope the problem goes away.

Recovering Data from a Virtual Disk. In a different situation, a sys admin was performing routine maintenance on a host machine, patching and updating the host operating system.

Unfortunately, one of the files associated with a virtual machine was inadvertently erased. I was called in to restore a backup of the virtual machine on a backup host machine. This is fairly easy to do and is one of the selling points of virtual machines. I had the backup of the virtual machine up and running quickly.

The wrinkle here is that there was a period of time between the backup and when the data was erased. There was data on the virtual machine's disk that was lost that we wanted to get back.

Data recovery tools deal with the physical media, looking for electromagnetic shadows or ghosts of deleted files. They are ill-suited to finding lost data on virtual machine disks.

In this case there was a one-to-one map between the physical machine and the virtual machine. So we had physical media to look at. That's why I restored the backup to a different host. This was a best-case scenario for recovering data from a virtual disk.

But a virtual machine is just a handful of files on a physical device. Hundreds of thousands of the virtual machine's files are packed into those few physical files. To recover the lost data, we had to completely recover the LARGE missing physical file and instantiate the virtual machine as it was at the moment it was erased.

We turned to Kroll-Ontrack, the market leader in forensic data recovery. Unfortunately, they were not able to put Humpty-Dumpty back together again with the tools they have.

So, it is interesting to note that while virtual machine technology is sold as providing enhanced data security, at least in this scenario, the opposite was true.

Running Amok. In the last scenario, I glossed over the matter of the Sys Admin's error. But this is a serious problem.

With cloud computing and virtual machines, Sys Admins are forced to access resources remotely. Sometimes, to work on these systems, we will have several remote sessions open on one physical PC, meaning that we have several desktops open and toggle between them. All of the desktops look the same. You constantly have to ask yourself, “Where am I and what am I doing here?” One trick I use to reduce confusion is to change the wallpaper on the different desktops so the remote machines do not all look the same as I toggle among them.

Confusion is not the only pitfall for system administrators. Here’s a completely different example of Sys Admins running amok.

Recently, a graduate student at a nearby university took a browser-based, online exam run by the school for one of his classes. A complex web of technology underlies the school's testing system. It uses web services to link different university databases and a BlackBoard content management system. Questions and answers traveled the university's intranet, the Internet and a WiFi access point in the classroom. The student used his own laptop to take the test.

Unfortunately, there was a technical glitch, and the school's IT department accused the student of cheating on the exam based on irregularities in the BlackBoard log of the student's exam session.

In Kafkaesque fashion, several hearings on the matter were conducted. Each time, the school's IT department insisted that they were the experts and that the BlackBoard log could only mean the student had cheated.

I testified on the student’s behalf in the final hearing. I told the hearing that one anomaly in one link in a long chain was not a smoking gun. I said that the IT department had jumped to a conclusion without looking into the matter completely. What did the web server logs show? What did the logs on the WiFi access point show? What about the student's laptop? The IT department never looked, and it was then months later and too late to go back and look.

Rather than appearing and acting like experts, in my view, the school's IT department was inept and vindictive in this case. Fortunately for the student, he was found innocent of the charges in this final hearing.

As cloud computing spreads, the roles and responsibilities of the technical staffs that support it will grow in importance. The consequences for organizations and the public when these technicians run amok will grow as well.

Scaling-Up. The examples I have described are all small. The consequences were not large and were not felt beyond a few individuals. But it is not hard to see how easily these examples could scale up in the cloud.

A software bug that cripples my Exchange server might just as easily interrupt service on Microsoft's Hotmail or MSN email services.

But while most of us can tolerate a short service interruption – be it in email, Internet access or electricity – many of us cannot tolerate data problems. More and more of us are forgoing hard copy and snail mail for email and electronic data stored in the cloud. When data is lost or privacy is breached, the consequences and the costs can be painful. Occurring in the cloud, involving Microsoft or Google, such a problem could affect millions of people.

At the systems and architectural level, there is fault tolerance built in. The Internet is highly redundant and it is supposed to be able to function during a nuclear attack. Web services work together, but they are “loosely coupled” so that a problem in one system does not spread to another.

When airliners destroyed the World Trade Center towers, an economic meltdown was averted because a bomb blast in the building eight years before had led financial firms headquartered there to construct clouds where data could be stored more securely. And that worked in 2001. Now most financial and other commercial data of large organizations is stored in clouds.

But psychological and political factors are such that the technological checks and balances may not be enough to stave off an economic meltdown. It has been reported that the Chinese, our enemies and terrorists are capable of attacking soft targets on the Internet (read: the Cloud). Hurricane Katrina, last year's financial crisis, pandemic flu… Events like these could conceivably impair the cloud and cause data loss, triggering panic and an economic meltdown at some point in the future.

Recommendations. Now I've got some recommendations for you, assuming you were a client of mine looking at cloud computing and planning to outsource one or more IT functions.

  1. Perform your due diligence. You need to be confident that the vendor you are going to use has the financial, technical and managerial resources to deliver and survive. In this vein, you might want to monitor audited financial statements of the vendor on an ongoing basis after you start working together.
  2. Does the Service Level Agreement promise what you need/expect? Chances are that you are not going to be able to negotiate the terms and penalties of the SLA. It is usually a take-it-or-leave-it situation for small businesses. But you should leave it if the SLA is not what you want or it is unbalanced.
  3. Insurance. Insist that the vendor has Professional Liability coverage. Some general business liability insurance policies specifically exclude liabilities for data processing activities. Look at the vendor's certificate of insurance. See if you can get listed as an additional insured on the vendor's policies. Make sure the policy limits are not too low. Look at the per-occurrence limits too.
  4. Be prepared to sue. As we all know, “IT happens.” In the event of a bad outcome, neither the vendor nor his insurance company is going to offer to make you whole. They will start by offering you pennies on the dollar. You are probably going to have to sue them to be made whole. Make sure that the contract you sign with the vendor lets you recover your legal fees if you win a judgment against him. You need that language in some jurisdictions, and without it your compensation might be significantly reduced.

Sunday, November 29, 2009

Google Fumbles

The answer to almost every question can be found using Google. But every now and then I come across a situation or a problem where the answer is not to be found on Google; at least not near the top of the pile. And believe me, I am pretty good at querying Google to get the answers I need.

Only by publicizing such Google "fumbles" can we hope to correct them. So, let's get the word out about the questions that cannot currently be answered by Google and let's get the answers!

Problem #1:

Small Business Server 2003 Backup fails approximately two days out of five. It is set to back up Monday - Friday at 8 PM. When it fails, the backup log shows, "Error returned while creating the volume shadow copy:0x8007000e. ... Not enough storage to complete this operation." However, storage is not the problem. One fix suggested on Google is to set the Volume Shadow Copy Service to start automatically. But that doesn't solve the problem.

Problem #2:

A different instance of Small Business Server has started reporting an error when you try to load the Monitoring and Reporting page from Server Management. Server monitoring and reporting emails also report this error.

The page cannot be displayed
An error occurred on the page you are trying to view.

To work around this problem, perform the following steps. After each step, try again to access the page.
  • Ensure that the MSSQL$SBSMONITORING service is started.
  • Ensure that the server is not low on memory or disk space.
  • Restart the server.
  • Verify that the server is functional and that there are no system-wide problems.
  • Run the Set Up Monitoring Reports and Alerts task in the Server Management Monitoring and Reporting taskpad.
None of this works.

Sunday, October 25, 2009

Swine Flu: National Emergency?

With the Swine Flu vaccine lagging and the virus widespread, what's the plan here? Call the Army back from Afghanistan to make sure we wash our hands and cover our mouths when coughing?

What national health emergencies can we expect in the future? Alcohol, tobacco, obesity, illegal drugs, firearms, AIDS, Alzheimer's, homelessness, ...

As the Federal Government gets ready to take over health care, don't worry: your health and happiness will be a Federal matter. Orgy-porgy time!

Wednesday, September 09, 2009

JungleDisk Works!

We are busy migrating business clients to cloud storage. The enabling technology for our users is JungleDisk software that allows us to map a drive on the user's PC to his/her/their files in cloud storage at Amazon. Here are the business cases we are dealing with:
  • Instead of a brand-name online backup service, JungleDisk and Amazon do the job for dimes instead of dollars.
  • Instead of replacing an aged Microsoft Small Business Server machine, Google Apps (Gmail) and JungleDisk/Amazon do the work better and MUCH CHEAPER.
  • Instead of hauling a laptop around the country, with the inherent risks of loss of data and confidentiality, a road warrior can access his files stored at Amazon from any machine with Internet access running JungleDisk software off a USB key.
As long as the user has broadband Internet access, cloud storage works well.

Here's one tidbit we've learned from our experience to date:
  • JungleDisk Workgroup Edition does not support file locking. This means two users can have the same file open. The last user to save their changes will overwrite the changes of the first user to save.
Here's a work-around that we've had success with:
  • Any document/file that will be worked on collaboratively goes into an aptly named cloud folder. If/when anybody wants to work on a file in such a folder, they can:
    • Rename the file while it is being edited OR
    • Cut and paste the file to his/her desktop to work on it.
People have been pretty good about following the rules, and we have not had problems with losing information because two people were working on a document, spreadsheet, database, etc. at the same time.
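The rename convention above can be sketched in a few lines of code. This is a minimal, hypothetical sketch (the function names and the ".EDITING-" marker are my own, not part of JungleDisk); it is an honor-system convention, not a real file lock.

```python
import os

def check_out(path, editor):
    """Claim a shared file by renaming it to show who is editing it.

    The new name makes the file's status obvious to anyone browsing
    the cloud folder. It does not prevent access; it just signals it.
    """
    base, ext = os.path.splitext(path)
    locked = f"{base}.EDITING-{editor}{ext}"
    os.rename(path, locked)
    return locked

def check_in(locked_path, editor):
    """Restore the original name once the edit is saved."""
    marker = f".EDITING-{editor}"
    restored = locked_path.replace(marker, "")
    os.rename(locked_path, restored)
    return restored
```

The same idea works whether the rename is done by hand in Explorer or scripted; the point is that the file's name carries its status.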

Wednesday, August 19, 2009

JScript Compilation Error

After reformatting and a fresh reload of Windows XP-Pro on a pair of client machines, I started getting a JScript Compilation pop-up error message on both machines after installing Java, then navigating to web pages with Java content.

The fix is simple, but I had a hard time finding it on the web. Here it is:

Open the browser. Click on Tools on the Menu bar, and click on Internet Options on the drop-down menu. Click the Connections Tab in the pop-up window. Click the LAN Settings button. In the pop-up window, UNCHECK the box associated with Automatically Detect Settings. Finally, OK out of the open pop-ups.

That's it. The error stops popping up. This fix worked on both the client machines. FYI, one of them was a laptop and the other was a tower PC.

What's the problem? Who knows. It's probably an issue that Sun and Microsoft are squabbling about.

Friday, July 17, 2009

Forcing IE 8 Uninstall

Need to uninstall Internet Explorer 8? The uninstall fails because files are missing that need to be copied to restore Internet Explorer 7? Here's a solution that I stumbled upon and that worked for me.

  1. Download and save the IE 7 installation file from Microsoft.
    • This is an .exe file. When you launch it, it expands the installation files in a temporary directory and then starts the install.
    • When the installation starts, you get an error message that there is already a later version of Internet Explorer installed on the system and that IE 7 cannot be installed.
    • If you click OK, the install stops and the temporary files disappear.
  2. Launch the IE 7 installation file.
  3. When the error message appears, leave it up and run the IE 8 uninstall routine.
  4. When the IE 8 uninstall routine does not find files it is looking for, select Browse and point it to the temporary folder created by the IE 7 install. That way the IE 8 uninstall routine will find most of the missing files.
  5. Several more missing files can be found by searching for them on the C: drive of the machine.
  6. In one case, I "found" a missing dll.000 file by copying a dll file on the C: with the same name and naming the copy dll.000.
  7. In this way I was able to successfully uninstall IE 8! Not unexpectedly, the resulting IE 7 was somewhat quirky, so I then launched the IE 7 installation file and had it do a "clean" or "complete" install, which cleared up the quirks.
Pretty cool!

Office Open XML confusion

Office Open XML is the Microsoft specification for Office 2007 files that was adopted as an ISO/IEC Standard in 2008. Among the benefits touted for Office Open XML is that one can open, edit and generate these files without needing to buy Microsoft Office 2007. To date, however, there are no practical alternatives to using Microsoft Office to work with these files.

Recently, a client of ours was baffled by the challenge of opening Open XML documents sent to him by Federal procurement officers. Our client received zip files which he logically unzipped. That left him with a relatively large number of xml files which he could open in a browser, but which were nonsense.

What to do? The answer is simple but HARD TO FIND on the web.

  1. Don't unzip the zip files. Rename them, changing the extension of each from .zip to .docx (assuming it is a Word document).
  2. Download and install the Microsoft Office File Compatibility Pack (if you are using Office 2003).
  3. Double-click on each of the .docx files, and it should open properly in Word.
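If there are a lot of these files, the rename in step 1 can be scripted. Here is a minimal sketch (the folder name is whatever you choose; an Office Open XML document really is a zip archive, which is why it gets relabeled in the first place):

```python
import os

def restore_docx_extensions(folder):
    """Rename Office Open XML packages from .zip back to .docx.

    A .docx file IS a zip archive internally, so mail scanners and
    download tools sometimes relabel it .zip. Renaming it back
    (not unzipping it) lets Word open it normally.
    """
    renamed = []
    for name in sorted(os.listdir(folder)):
        base, ext = os.path.splitext(name)
        if ext.lower() == ".zip":
            old = os.path.join(folder, name)
            new = os.path.join(folder, base + ".docx")
            os.rename(old, new)
            renamed.append(new)
    return renamed
```

The same trick applies to .xlsx and .pptx files; only the target extension changes.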

Monday, July 13, 2009

Corporate Chivalry

For the moment anyway, corporate chivalry lives on the Web.

Chivalry refers to the system of beliefs and practices originating with knights in the Middle Ages. Gallantry toward women was one of its core principles. And that has given chivalry a bad odor in modern times.

Today I discovered that Switchboard has started adding people's approximate ages to search results, providing a 5-year age range in addition to a person's address and phone number. Having satisfied myself that the ranges were accurate by searching for several people whose ages I know, I cast my net more widely.

At this point it became clear to me that ages are available from Switchboard for men only. None of the women I searched for had their ages listed. All of the men I looked for came with ages.

In these enlightened days, I do not expect this vestige of medieval times, this chivalry, to last long. Ladies, prepare to be revealed! Or will Switchboard choose to reverse its discrimination and stop disclosing gentlemen's ages too?

Sunday, June 28, 2009


The last resort in tech support is RTFM (read the freaking manual)!

It is almost impossible to find the answer you need quickly in the manual. Who wants to write a manual? How much do manufacturers want to pay for a manual? Bottom line, the manual may be written by someone with limited English proficiency and no experience with the product.

For a "just-in-time" answer to a practical problem, Google is your best bet. But crafting a query that describes the problem concisely is not always easy. And any proposed fixes may not work, or, worse, they may compound the problem. So, skepticism is necessary and care must be taken to avoid making matters worse.

Keystone Computer supports several Cisco Small Business RV082 dual-WAN routers. Our experience with this device illustrates my points today. Here's some misinformation I got from the net via Google:

  • "Just in case no one told you, "none" of the WRV54G and RV0XX series support passing the "GRE" protocol (47), so, that's why you get stuck at "verifying network" and can't use the microsoft vpn client (yeah, sucks azz...). This was done purposely to force people to use the quickvpn client."
    • Not so; the RV082 usually plays nicely with Microsoft VPNs. Only if you enable one-to-one NAT do you have problems with protocol 47 when VPN'ing over forwarded ports.
And here's where I was confused and led astray by the manual:
  • "One-to-One NAT opens the firewall for one network user a lot like the DMZ host feature. In this feature, however, the network user is restricted to a single website."
    • That second sentence is hogwash.
  • "NOTE: One-to-One NAT does change the way the firewall functions work. Access to machines on the LAN from the Internet will be allowed unless Network Access Rules are set."
    • This is bull hockey. There is nothing you can do with Access Rules to limit access to a machine that has been exposed via one-to-one NAT.

Fortunately, this story has a happy ending. With a minimum of fuss and wait, I was able to call and talk to a Cisco Small Business engineer. He knew what he was talking about. I let him log into the router remotely and configure it to do exactly what I need it to do for one of my clients. Problem solved!

Wednesday, May 27, 2009

Are you backed-up?

Here's an all-too-familiar scenario. Someone calls our office and says, "Help! My system's dead." Reflexively we ask, "Are you backed-up?" Of course not; would he be calling if he was? Besides, most people do not bother with backups.

So, why do we ask? We do it to remind people that they are responsible for the mess they are in. If/when we rescue them, we become heroes; they tell their friends. Round these parts, you can find more than one child named Keystone in gratitude for our work.

The pain and cost of a system failure and the resulting loss of data can be substantial. Most businesses that lose their financial records, customer lists, inventory data, etc. go out of business. But it is only after a brush with disaster that most people seriously consider developing and implementing a backup plan.

Products exist which claim to provide "backup protection." However, too often, all they provide is a false sense of security.
  • Scheduled backups fail for all sorts of reasons. If you don't monitor the logs, you don't know when they don't happen.
  • People store important files in folders that are not included in the backup set.
  • If the backup goes up in flames with the server, you have no backup.
  • It might take a week to get new hardware to replace the failed machine, during which you are dead in the water.
  • Today's backup overwrites yesterday's so if you want to restore a version of a file from last week, you are out of luck.
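The first pitfall on that list, silently failing scheduled backups, is cheap to guard against. Here is a minimal sketch of a staleness check (the folder name and the 26-hour threshold are my own assumptions, sized for a nightly job with a little slack):

```python
import os
import time

def newest_backup_age_hours(backup_dir):
    """Age in hours of the most recently modified file in backup_dir,
    or None if the folder holds no files at all."""
    paths = [os.path.join(backup_dir, f) for f in os.listdir(backup_dir)]
    files = [p for p in paths if os.path.isfile(p)]
    if not files:
        return None
    newest = max(os.path.getmtime(p) for p in files)
    return (time.time() - newest) / 3600.0

def backup_is_stale(backup_dir, max_age_hours=26):
    """True when no backup file has appeared within the expected
    window -- the signal to go read the backup logs."""
    age = newest_backup_age_hours(backup_dir)
    return age is None or age > max_age_hours
```

Scheduled daily, a check like this turns a silent failure into a nagging alert, which is the whole battle.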

So, what's the answer? Are you backed up? The best that most of us can hope for is 'maybe' or 'somewhat'. Unless you are a financial institution or other organization with DEEP pockets, it is almost impossible to be completely backed up; protected against data loss under any possible scenario.

Sunday, April 19, 2009

Server Backup Puzzle

Here's the puzzle: while hard disks have gotten larger and cheaper, the day is still 24 hours long. Backup software seems to process files at 20-25 GB/hour, so if you have more than 500 GB on a server to backup each night, you cannot finish one job before needing to start the next. This may lead to problems.
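The arithmetic behind the puzzle, using the rates quoted above:

```python
def backup_window_hours(data_gb, rate_gb_per_hour):
    """Hours needed to back up data_gb at a given throughput."""
    return data_gb / rate_gb_per_hour

# At the observed 20-25 GB/hr, 500 GB no longer fits in a day:
slow = backup_window_hours(500, 20)   # 25 hours: overruns the next night's job
fast = backup_window_hours(500, 25)   # 20 hours: barely fits
```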

Why is backup software so slow? For comparison, a straight file-copy job in Windows or the command line goes at the rate of about 100 GB/hr.
  • It doesn't matter if you are backing up to a USB-attached external hard disk, a NAS device or an eSATA hard disk; the rate stays the same (20-25 GB/hr).
  • It doesn't seem to matter what the software is either - Backup Exec, NTBackup, et al.
  • Curiously, backing up a single file, like an Exchange data store, runs more quickly - 40 GB/hr.
My solution, if you can call it that, is to:
  1. Write a batch file to copy files that are not open/locked by running services. Then schedule the batch file to run every night.
  2. Use backup software to backup files that are open/locked.
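Step 1 can be sketched in a few lines. This is a cross-platform Python sketch rather than the actual batch file (the function names are my own); the key idea is that opening a file which another process has locked exclusively raises an error, and those files get left for the backup software:

```python
import os
import shutil

def copy_unlocked(src_dir, dst_dir):
    """Copy every file we can open from src_dir to dst_dir, recording
    the ones a running service has locked so backup software can
    handle them separately."""
    os.makedirs(dst_dir, exist_ok=True)
    locked = []
    for name in os.listdir(src_dir):
        src = os.path.join(src_dir, name)
        if not os.path.isfile(src):
            continue
        try:
            # On Windows, opening for read fails when another process
            # holds an exclusive lock on the file.
            with open(src, "rb") as f, \
                 open(os.path.join(dst_dir, name), "wb") as out:
                shutil.copyfileobj(f, out)
        except OSError:
            locked.append(src)
    return locked
```

Scheduled nightly, a script like this handles the bulk of the data at plain file-copy speed, leaving only the locked files for the slower backup software.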
Kan Yabumoto explains the issue of open/locked files, "As in any multi-tasking environment (including networked environments), when a file is opened by a process (a running program), an attempt to access the file by another process (such as the XXCOPY program) is totally dependent on how the program which opened the file in the first place wants to share it. Often times, the first program is not very cooperative, or just self-centered and won't give any access to anyone else."

Backup software uses the Volume Shadow Copy Service (VSS) to make/use shadow copies of files to get around the access/sharing problem.

There are at least two problems with my solution:
  1. Some if not all of the batch-file/command-line copy commands do not handle long file names gracefully.
  2. It is messy and not always obvious which files need to be backed up versus those which can be copied.
If there is a better way, I'd appreciate it if someone would let me know!

Monday, March 16, 2009

Lessons Learned (again?)

  • Educators say people do not really learn something until they have "learned" it somewhere between 5 and 5,000 times. So, if you are like me, you cannot hear this one too often: BACKUP! Automate your backups and test to make sure that you can restore files from a backup.
  • This one's in the same category as the lesson on backing up; cover your assets. You KNOW you should do it! For example, I recently added an obvious but necessary(?) Limitation of Liability clause to this blog. See the bottom of the page...
  • Here's a new one for you... Active Directory has a secret that it shares with each domain computer. That secret is updated every 30 days. If/when that secret gets out of sync, strange things start to happen, like the administrator username and password not working on a machine. How does this AD secret get out-of-sync? Say a domain computer has been off for a while or let's say you have to restore a backup of the AD server. Good luck troubleshooting and resolving this AD problem!
  • Twitter is the new Facebook/MySpace. It's a waste of time at best. At worst, it is an invitation for bad guys to steal your identity and/or your stuff. It's an old lesson: most computer problems are really self-inflicted by the user.
  • More backup stuff... Windows SBS/NTBackup is very slow. Backing up to a NAS device goes at the rate of about 12 GB/hr. Backing up to a USB-2 external hard disk goes at the rate of about 24 GB/hr. Looking at the Windows Task Manager, CPU and network utilization is minimal during backups. Copying files goes much quicker. What's the problem with backups? These rates are too slow for a daily backup of 1 TB of data.

Monday, February 16, 2009


"I also wanted to say thank you for all the great work you did during our move. Folks have commented to me that [disruption to] the offsite access to their e-mail and the ftp site was so minimal they barely noticed it was down at all. Also the hook up of the pc’s in the new office was fantastic. I think everyone was surprised to come in Monday morning and be able to get right to work. So, thank you very much – it was a job extremely well done from start to finish."

Pam Sarlouis
Mangi Environmental Group

"This is all Greek to me...that's why we have you. I am so grateful to you for your service and advice. My life is much more pleasant since we found you, believe me!"

Ralph Perrino, PhD
Northern Virginia Tutoring Service

Not every job has a happy outcome and a satisfied client. Still, some of our best work is done when things go awry, when the client is breathing down our necks and asking, incredulously, "Do you know what you are doing?" Yes, even when the client's confidence in us is shaken.

At the outset of our relationship with Northern Virginia Tutoring Service, I drilled through a wall to run network cable from here to there. When I pulled the drill bit out of the wall, there was a knot of twisted, broken wires that was the trunk line of the office phone system. A simple job turned into an urgent problem, outside our normal purview (phones, not computers).

Ralph Perrino was very cool. He would have been fully entitled to throw me out of the office and/or yell at me. Instead, he allowed me to clean up the mess I made. That was not my proudest moment, but I am proud of the relationship forged in that adversity.

Sunday, January 25, 2009

Your Database

Everybody has a database. Your DNA is a database.

Every business and organization has databases - lists, spreadsheets, financial reports, et al. - but does your organization have its "DNA database"? That's a database that monitors and controls the processes that define the organization.

Every organization has one or more DNA databases. One might exist in the mind of an entrepreneur. One might be a cultural thing. One might be a business plan. One might be a ledger book.

Electronic Data

Electronic or computer storage of information is usually the best way to store business information, especially large amounts of it. Computers can automate data processing and communication, increasing the efficiency and reducing the cost of many everyday activities. So, while the time and money involved in planning, implementing and maintaining an electronic database can be daunting (hardware, software, business process reengineering, data entry/capture, data quality assurance, user training, et al.), putting your organization's DNA database on a computer is essential to the organization's success in the information age.

Not all databases are created equal. The worth of a database depends upon factors including:
  • The intrinsic value of the information stored.
  • The quality and cleanliness of the data.
  • The shelf-life of the data.
  • The accessibility of the information in the database.
  • The security of the database.
  • The relevance of the reports and queries designed and the ability to perform ad hoc queries.
  • The accuracy of the data model to the business processes of the organization.
  • The ability of the organization's executives and staff to use the database effectively.
For example, yesterday's lottery number is worthless. Tomorrow's lottery number is priceless, unless:
  • Everybody knows tomorrow's number so the pot is divided a million ways.
  • The lottery tickets are being sold by bodegas in Nicaragua but you are in Nebraska.
  • You got the numbers transposed.
  • This is not the Powerball lottery; it's a church charity lottery.
  • ...

Process Vs. Outcomes Data

There is a saying, "You can't manage what you can't measure." But it does not follow that you can manage what you can measure. Many systems are designed to measure the performance of business units; they measure outcomes or results from period to period. Since they do not track the key variables responsible for the outcomes, it is seldom clear why results are what they are or what changes need to be made (management).

The stock market is a good example. Everyone knows what it is doing from moment to moment; results are continuously updated. Why the market is doing what it is doing is anyone's guess.

In order to measure to manage, an organization must have a good understanding of how inputs that it can control are converted into results (i.e., a business process). Then it can model the process, collect data, establish goals and norms, monitor variances and make adjustments to improve results (manage).

To model a business process, it helps to articulate "use cases" or scenarios that describe the way various constituents (customers, employees, vendors, management, etc.) interact with the process. The use cases provide a perspective for analyzing the process, making it more efficient and building information systems to manage it better.

Stumbling Blocks

  • The "data model" is the foundation of a relational database, essential for monitoring a business process. The data model is the blueprint identifying what data will be collected, how the data will be stored and how different tables in the database are related to each other. Often, little or no work goes into the data model, limiting the effectiveness of the information system. For example, you can buy an off-the-shelf accounting package and implement it using one of the sample charts of accounts (i.e., data models). But chances are that a sample chart of accounts is not right for your operations.
  • Data quality is an issue that organizations sometimes ignore at their peril. "Garbage in, garbage out," as the saying goes. Depending upon the potential cost of data errors, it may be necessary to take heroic measures to identify, correct or cleanse bad data from the database.
  • Many organizations procure and implement systems but fail to invest in the education and training their people need to operate and maintain the system. Managers and executives may not realize that they do not have the background they need to properly use evidence-based, decision-support systems. Because people come and go in the organization, the need for education and training is an ongoing one, requiring time and money. But too often, the budget for such training is nil.


Computerizing your database is going to cost a lot, especially when you consider the hidden costs.


In this competitive world, your organization cannot afford to be without computerized DNA databases. The complexity and cost of developing and maintaining these databases should not be underestimated, however. Understanding and commitment are needed at the top of the organization to fund and nurture these databases.