Without a doubt, one of the hot topics in laboratory informatics for a number of years has been the use of the cloud to host Laboratory Information Management Systems (LIMS) and other lab-based systems. However, is the cloud, and whichever cloud model is used or recommended (e.g., Platform as a Service, Software as a Service, Infrastructure as a Service, or whatever other four-letter acronym someone comes up with), really the be-all and end-all of lab informatics?
The Benefits of Cloud
The use of the cloud to host lab informatics solutions can have real benefits. These are well documented and include a shift from a capital cost model to a recurring cost model if you choose to pay on a subscription basis, a reduced need for in-house IT resources to support systems and solutions, the use of third-party infrastructure, automatic and guaranteed updates of operating systems and cyber security software and, potentially, automatic updates of the LIMS solution as new releases become available. In addition, having a web-based solution hosted in the cloud can make it easier to manage user access and can simplify collaboration within and between organisations. The cloud can also make it easier to access data held in multiple databases, allowing data scientists to analyse large data sets from disparate sources; this is key to realising the value of your data assets. However, it is always worthwhile making sure that these benefits will be of value to you and your organisation, and that any potential risks are identified and minimised.
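As an illustration of that last point, the short Python sketch below pulls records from two hypothetical cloud-hosted databases and combines them for analysis. The connection strings, table names and columns are invented for the example; a real implementation would depend entirely on your own providers and schemas.

```python
# Illustrative only: combine result sets from two cloud-hosted databases.
# Connection URLs, table names and column names below are hypothetical.
import pandas as pd
from sqlalchemy import create_engine

# Hypothetical endpoints (e.g. a LIMS database and an instrument-data store);
# substitute your own hosts and credentials.
lims_engine = create_engine("postgresql://user:password@lims.example.com/lims")
inst_engine = create_engine("postgresql://user:password@data.example.com/instruments")

# Pull only the data needed for the analysis from each source.
samples = pd.read_sql("SELECT sample_id, matrix, received_on FROM samples", lims_engine)
results = pd.read_sql("SELECT sample_id, analyte, value, units FROM results", inst_engine)

# Join the two sources on a shared key so they can be analysed together.
combined = samples.merge(results, on="sample_id", how="inner")
print(combined.groupby("analyte")["value"].describe())
```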
Understand the Risks
Any organisation considering a cloud-based implementation must ensure that the hosting organisation can address any concerns they may have, or risks they have identified, about third-party hosting. Reputable hosting organisations will have no difficulty doing this. Even so, it is worth confirming that cloud service providers have the required high levels of physical and virtual security in place, and the level of guaranteed uptime should also be checked. With regard to uptime, organisations must identify the level they actually need; the higher the guaranteed uptime, the more it will cost, so an informed choice needs to be made (the short calculation below shows how much downtime common uptime tiers still allow). Organisations must also understand how software upgrades are applied, especially to the solution itself. Depending on the model chosen, some suppliers will dictate when upgrades occur, potentially limiting the time available for users to test, validate and be trained on the new software, and forcing an upgrade on users when it is not wanted. One other key area to check is access to your data. If you choose to discontinue your relationship with the supplier, will you be able to get access to, or a copy of, your own data? And what will the supplier do with any backups and copies of your system and associated data they may hold?
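To make the uptime trade-off concrete, the small sketch below converts a guaranteed availability percentage into the downtime it still permits over a year. The tiers shown are commonly quoted figures rather than any particular provider's commitment.

```python
# Illustrative only: how much downtime a given availability guarantee still allows.
HOURS_PER_YEAR = 24 * 365  # 8,760 hours, ignoring leap years for simplicity

# Commonly quoted availability tiers (not tied to any specific provider).
for sla in (99.0, 99.5, 99.9, 99.95, 99.99):
    downtime_hours = HOURS_PER_YEAR * (1 - sla / 100)
    print(f"{sla:>6}% uptime allows about {downtime_hours:6.2f} hours "
          f"({downtime_hours * 60:7.1f} minutes) of downtime per year")
```

Even the step from 99.9% to 99.99% cuts the permitted downtime from roughly nine hours to under an hour a year, which helps explain why the higher tiers command a higher price.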