Month: February 2020

Technology’s Impact on Healthcare

Technology is transforming the way healthcare operates, and the impact is not on one level but on many. It is a game-changer for the way communication happens and the way data is stored. Most importantly, it is enhancing the patient experience: technology is changing the way patients are diagnosed and treated, and it is also changing the way the business side is handled.

The true dynamo behind the great healthcare overhaul is mobile technology: the smartphones and tablets carried by doctors and nurses as they move between one location and another. Cloud technology provides on-demand access to almost any IT resource you can imagine, and it delivers resources that were previously unavailable. This blog will introduce some of these new resources. Because they make use of cloud computing, they can be accessed from any device anywhere on the planet with an Internet connection. The added benefit, again because it is in the cloud, is the flexibility and versatility of being able to scale capacity up or back as needed. Bandwidth and storage grow with demand. Gone are the days of being frustrated with a slow workstation.

There are two drivers behind this technology: reducing costs and improving the quality of patient care.

There are more mobile devices than there are people on Earth, and clinicians are connected as never before. Medical professionals can immediately tap into, contribute to, and benefit from a growing pool of global medical knowledge. At the swipe of a finger, a doctor can access the latest research on a given disease, read about a new drug, or review clinical trial outcomes. They can benefit from the collective experience of colleagues worldwide.

Things are changing from the patient side as well. Patients are becoming increasingly accountable for their own health and well-being. They’re doing their homework on diseases and illnesses. They want access to their own data. In the June 13, 2017, Forbes magazine article How The Cloud is Transforming Healthcare, Khalid Raza writes, “providers must satisfy the demand for instant, top-quality access to healthcare services. Patients – who are accustomed to the 24/7 availability and service from online retailers and financial institutions – expect and even demand such access and services from their healthcare providers. People have become more involved in managing their own healthcare needs, which only complicates matters, and gravitate to the web for diagnosis, information, and treatments.”

Software companies have kept their finger on the pulse of these industry-wide healthcare trends. They have responded with new technologies designed to contribute significantly to the flow of knowledge and the efficiency of future healthcare. There are now multiple secure messaging technologies available to doctors who want a quick, informal consultation with a colleague. These tools share many of the same features; for example, all communication is tracked and logged automatically.

Here are a few of the new technologies that are changing the face of medicine. And they’re all being facilitated by cloud computing in one way or another.

 

DIGITAL FLOWS SPEED UP DIAGNOSIS, PROGNOSIS & TREATMENTS

Thick, heavy reference books still sit in doctors' offices and at nursing stations, collecting dust. They have probably been forgotten, or left where they were simply for reasons of interior design. Now, if a nurse or doctor needs a quick reference, they pull out a smartphone. Mobile apps let clinicians quickly look up anything they need to know about drug interactions or the complications associated with a particular condition.

 

The Med360 Mobile App

Med360 is a program that automatically collects every new publication matching your interests, pulling data from thousands of traditional open-access journals and funneling it into your personal stream. A doctor has only to call up the app on his or her smartphone and do a quick scan of the screen to know exactly what is going on with a patient's medication history and reconciliation. Pharmacy pickups, dosage changes, and refills are presented in a clear interface on the clinician's mobile device.

 

 

 

 

 

VAST AMOUNTS OF DATA

A February 2019 article in Nature Medicine reported on a program that used patient information such as symptoms, history, and lab results to diagnose common childhood diseases. According to the article, the system was given data on nearly 600,000 patients at a pediatric hospital in China, and the results it produced were highly accurate.

In another February 2019 article, Cade Metz reported that Google is developing and testing systems that analyze electronic health records in an effort to flag medical conditions such as osteoporosis or diabetes. Similar technologies are being developed to detect signs of illness and disease based solely on X-rays, M.R.I.s, and retina scans. What these innovations have in common is their reliance on neural networks, a breed of artificial intelligence that learns tasks largely on its own by analyzing vast amounts of data.

Computers can be programmed to recognize patterns in vast amounts of data, and those patterns can be linked to specific conditions. These are patterns that would be difficult, if not impossible, for a person to notice. Huge amounts of medical imaging data are fed into artificial neural networks, and the computer then learns on the job, so to speak: the more data it receives, the better it becomes at interpreting it.
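To make that learning process concrete, here is a minimal sketch, assuming the scikit-learn library and synthetic stand-in data rather than real medical images; it illustrates the general technique, not the diagnostic systems described above.

```python
# Minimal sketch: a small neural network learns to separate two classes from
# labeled examples. The synthetic data stands in for features extracted from
# medical images; this is an illustration, not the systems in the article.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score

# Synthetic "imaging" features: 20 numbers per case, two diagnostic classes.
X, y = make_classification(n_samples=5000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

# Train on progressively larger slices to show that more data tends to help.
for n in (100, 1000, len(X_train)):
    model = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500, random_state=0)
    model.fit(X_train[:n], y_train[:n])
    print(f"trained on {n} cases -> test accuracy {accuracy_score(y_test, model.predict(X_test)):.2f}")
```

Real diagnostic systems use far richer data and much deeper networks, but the principle is the same: more labeled examples generally mean better pattern recognition.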

This learning process is already being used in many applications. Computers learn to understand speech and identify objects this way. Self-driving cars learn to recognize stop signs and to tell the difference between a pedestrian and a telephone pole. Google has created a program to help pathologists read microscope slides to diagnose conditions like cancer.

 

Mobile devices are the key to tapping into knowledge flow streams.

KNOWLEDGE ACCESS ON ANY DEVICE ANYWHERE

The fact that everything is accessible on any device anywhere means patients can get medical help at the hospital, at the ambulatory center, or in the comfort of their own home. In the past, if you wanted to see the doctor, you had to physically travel to the doctor's office or go to the emergency room.

Now, much of that care can appropriately be pushed into the patient’s home.

 

Telehealth is the distribution of health-related services and information via electronic information and telecommunication technologies. It allows long-distance patient and clinician contact, care, advice, reminders, education, intervention, monitoring, and remote admissions.

Hospital at Home, a program at Mount Sinai, enables video visits. Patients can check in, access monitoring tools, and input their vital statistics, doing things like checking their pulse, blood pressure, or weight. The information can then be sent to the patient's care team for review and response.

In a May 10, 2019, Harvard Business Review article, Albert Siu and Linda V. DeCherrie report that “research has shown varying but clearly positive impacts on mortality, clinical outcomes, readmission rates, and cost. A 2012 meta-analysis of 61 randomized, controlled trials, for instance, found that the hospital-at-home patients had a 19% lower six-month mortality rate compared to hospitalized patients. Our research finds that patients who receive hospital-at-home care have fewer complications and readmissions; they also rate their health care experience more highly.”

Bruce Darrow, MD, Ph.D. and Chief Medical Information Officer at Mount Sinai in New York.

Bruce Darrow, M.D., Ph.D., cardiologist and Chief Medical Information Officer for Mount Sinai Health Systems says, “It’s empowering for the patient and it’s good for the clinicians too. The technology allows doctors to let the patients do the jobs they would want to do themselves.  Artificial Intelligence is going to be essential to healthcare. When we think about doing the work with patients at growing population levels effectively, A.I. technology is going to play an important role. If I’m a primary care doctor who is taking care of 2,500 patients, only 20 or 30 of those patients will come into my office on any given day. At the same time, there may be several at home who are at risk. Rather than combing through the entire list of 2,500 patients, if I have tools to look at the prior history of the patient along with their current vital signs, I can determine who I need to see first.”

 

Medical record systems are notorious for not communicating with one another.

Darrow goes on to say, “Electronic medical records have been challenging to connect to one another because of the way they were born. The original idea was not to generate a national patient identity that would allow the same patient to be identified as such from one system to another. There was no original standard for what the medical records would do and how they would interoperate with each other.

The government and the healthcare industry have recognized the problem. That’s where the work of the next few years will be. We’re making progress. At this point, I have patients who come to see me in the office. I can pull their information from a number of systems throughout  the New York area as well as nationwide.”

Telehealth

Telemedicine is the practice of caring for patients remotely when the provider and patient are not physically present with each other. HIPAA-compliant video technology enables clinicians to consult with their patients effectively. Patients can follow up with their doctor through a video visit instead of making the trip to the hospital or clinician's office, and they can get an on-demand video visit with emergency-trained doctors. A doctor can communicate virtually with a specialist, or a stroke specialist can be brought in remotely to participate in the care of an emergency room patient. All of these things are possible today.

 

The Main Benefit of VDI
VDI Planning: 4 Key Pitfalls to Avoid
What is VDI?

Virtual Desktop Infrastructure (VDI) delivers virtualized desktops hosted on remote servers and accessed over the Internet. Reducing the need for hardware while improving flexibility, VDI offers practical benefits as well as a hefty return on investment. There is a strong business case to be made. According to IDC's January 2016 report “The Business Value of VMware Horizon,” there is a five-year return on investment of 413 percent. On average, a virtualized desktop costs 71 percent less to buy, deploy, support, maintain, and use over a five-year period, on a per-device basis, and users spend 76 percent less time on device and application log-ins. VDI enables companies to make full use of human capital while preventing many IT-related issues. We need all the help we can get to unlock massive human assets such as talent, empathy, and creativity, the things computers aren't that good at. There are indeed great advantages to moving to a DaaS (Desktop as a Service) environment, but there are also many opportunities for making mistakes along the way. Let's take a look at the four most common pitfalls associated with VDI migration.

A TechRepublic article cites a lack of planning as a major pitfall of VDI integration, reporting that companies failed to plan for enough resources. Don't provision for today or tomorrow; design an infrastructure that will serve your needs next year and for the years ahead. That article was from 2013, and it is just as relevant today.

Decide what the priorities are in your VDI environment.

The problem with most VDI implementations is a lack of planning. Internal stakeholders should begin with a comprehensive assessment of the IT environment, and should also consider the individual desktop environment. The VDI landscape has changed over the years, but planning and project management remain the keys to a successful VDI adoption. The initial steps start with an internal dialogue, and it's a good idea to bring in outside expert advice early in the process. Each company is unique, with different demands and different expectations. The time and effort put into VDI planning will pay dividends for years.

Here are a few of the most common hurdles. They can be overcome when identified early.

VDI Planning
A common problem with VDI planning is wanting to include everything.
Don’t Try to Do Everything at Once

The first common issue in rolling out a VDI initiative is trying to do too much at once. This applies to large and small environments alike; VDI does not look the same at any two companies.

Don't try to include every attractive feature in your initial implementation. Focus on meeting key objectives, and be selective. Understand the major features and benefits of VDI, but don't try to include everything at the beginning; doing so will only slow down the process and distract you from your key objectives. A white paper by VMware recommends taking a step back and considering what you're trying to do before you even think about IT requirements. Instead of diving straight into technical requirements, such as the number of servers and the sizing of WAN links, begin by exploring user needs, business drivers, and special requirements. These special requirements might include compliance issues, high availability, disaster recovery plans, or even the need for the business to rapidly onboard large numbers of new users due to mergers or acquisitions.

Don't get stuck on age-old VDI questions, such as whether to use non-persistent or persistent desktops in your initial deployment.

A company may never deliver a usable VDI solution if it allows itself to get stuck on an idea. Say you determine that 99% of your VDI desktops will be non-persistent; know that chasing that target can consume a great deal of OpEx and CapEx.

Stay Focused on Key Points
Zero in on what’s most important to you in a VDI environment.

Narrow down what you need in the planning stage to get VDI into a solid, usable state. Set up your VDI on a lean set of criteria; you can make additions as you go.

Do an Effective Initial Assessment

The next hurdle is company-specific, and it is often overlooked because of its upfront cost and time: the VDI assessment that should be part of the planning. The assessment is the discovery phase of the project. It will help you isolate and focus on what is most important for your business.

Identify who will be using the VDI solution. The assessment has two parts: discussion and analysis. Be sure the process includes all the stakeholders, including those who will be using the virtual desktops. Getting them involved early in the design process will help manage expectations and go a long way toward nurturing acceptance of the resulting VDI environment.

Bring All the Brains to the Table
Bringing all the brains to the table ensures the existing infrastructure is understood and all solution options are considered.

Let's use the example of an HR group that will be using VDI during the initial deployment. Start with an initial interview whose agenda includes setting expectations for VDI, and begin by looking at how the group currently uses its computing environment.

Discussions along these lines will establish some parameters.
Do they generally use only a common set of four applications? Do they work at varied times throughout the day? Do they need only a web browser and the ability to email clients on the company network?

You also need to gather data on what the traditional desktops are doing during the day. Which applications are used? What do the machines need in order to operate?

Most PCs are oversized, with wasted resources, and VDI is all about compute and storage density. Accurate sizing translates directly into cost savings. There are several tools that can handle the second part of this equation, but don't overlook the first.
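As a rough illustration of that sizing exercise, the back-of-the-envelope sketch below turns measured per-user demand into a host count. Every number in it is a hypothetical placeholder, not a recommendation; real figures should come from the assessment data gathered above.

```python
# Back-of-the-envelope VDI sizing sketch. All numbers are hypothetical
# placeholders; real sizing should come from your assessment data.
import math

users = 250                                # concurrent virtual desktops needed
vcpu_per_user, ram_gb_per_user = 2, 4      # measured average per-desktop demand
host_cores, host_ram_gb = 32, 512          # capacity of one candidate host
cpu_overcommit = 4.0                       # vCPU-to-physical-core ratio you accept
ram_headroom = 0.85                        # keep ~15% of RAM for hypervisor and spikes

hosts_for_cpu = math.ceil(users * vcpu_per_user / (host_cores * cpu_overcommit))
hosts_for_ram = math.ceil(users * ram_gb_per_user / (host_ram_gb * ram_headroom))
hosts_needed = max(hosts_for_cpu, hosts_for_ram) + 1   # +1 spare host (N+1)

print(f"CPU-bound estimate: {hosts_for_cpu} hosts, RAM-bound estimate: {hosts_for_ram} hosts")
print(f"Plan for {hosts_needed} hosts (including one spare for maintenance/failover)")
```

With these sample numbers, RAM and CPU point to different host counts; the larger of the two, plus a spare, is the figure to plan around.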

Don’t Overlook Management and Support Responsibilities
This third point concerns IT staff.

Who will be managing the new environment once the consultants have departed? Will you share this duty between existing desktop/infrastructure teams? Or will a new team arise to manage the entire solution? Decide this early on.

Managing a VDI environment requires an engineer who understands several key technologies and how they affect the virtual desktop. These technologies include, but are not limited to:

Networking
Know how users connect to the virtual desktop and where to troubleshoot problems like lost connections or poor performance.

Compute/Infrastructure
Deep understanding of hypervisors and server infrastructure, depending on the vendor of choice

Security
Knowledge of the security products that will run inside the virtual desktops and sit in their network path, for troubleshooting purposes.

Desktop Engineering
Basic knowledge of customizing Windows installations and troubleshooting them.

Additionally, there are several other ancillary technologies that come in handy. These technologies include DNS, Active Directory, Application Packaging/Delivery, Load Balancing, and Storage.

These skills can come from various classroom training offerings, but many should come from experience. Knowing how all these different technologies work together in your environment is critical.

In larger companies, many of these technologies are owned and managed by separate teams. It is crucial that all the stakeholders be aware of the impact of VDI.

Know who has ownership of the new VDI systems, and make sure there is buy-in from across your IT organization. This is important to establish at the beginning. Everyone needs to be on the same page, which will make it easier to train those who need to ramp up.

This ownership and buy-in include first-line defenders like your typical service desk team. Let them know they're responsible for fielding common VDI-related issues as they come in, and provide the education and resources to support them. Service and support are a key benefit of partnering with seasoned VDI consultants.

Don’t Forget the User Experience

As VDI deployment comes together, don’t forget about the user experience.

The User Experience Is Important
User experience is the final litmus test. How users feel about the experience determines the success or failure of VDI or DaaS.

Consider how things were before VDI. Chances are, your employees have been using similar pieces of hardware. They know how their workstation machines perform every day (good or bad). They’ll compare the new VDI environment to what they had before.

This goes back to the assessment stage. Understanding the proper sizing and performance of each machine is important; it can mean the difference between an adoption that succeeds and one that doesn't. But it's also about more than that.

If users now have to log in twice to access their virtual desktops, they will complain. If the machine hangs when opening a video conference, they will complain. If patches cause reboots on different days, they will complain. You want to make the changeover to VDI as seamless as possible.

The experience should be better than on a traditional desktop, not equal to it or worse. Make sure you plan to provide the expected performance of each workstation, and allow for a tailored storage solution that is intelligent and optimized for VDI. Consider network outages: if, for whatever reason, users can't access their virtual desktops, that is also a problem. Here's the point: outside factors contribute to the total experience on a virtual desktop, and many of them will be beyond your control.

The successful adoption of VDI means user acceptance. It means delivering a desktop-like experience and providing the necessary training and support. Company-wide buy-in is key to the success of the whole program, and it all begins with planning and making sure you have every brain at the table when that happens.

Ransomware Targets Healthcare
The Healthcare Ransomware Epidemic: How to Protect Your Patients
The Problem is Becoming a Crisis

Data breaches are happening at an alarming rate. In fact, the threat of ransomware attacks has risen to crisis levels. While there's increased awareness, attacks are becoming more sophisticated, and organizations large and small are being hit. No one is immune. The healthcare industry has been, and continues to be, a prime target, and for good reason: healthcare organizations are considered low-hanging fruit by cybercriminals. Hackers know healthcare centers are notorious for weak security. Most hospitals don't have procedures in place to restore a network once it is locked by ransomware. Most hospital applications have little or no network segmentation, there are no firewalls between workloads, and basic security protocols are not in place.

Beyond the alarming ransomware statistics, some attacks never get reported. Fifty-two data breaches were reported to the U.S. Department of Health and Human Services in October alone, and last year hackers stole over 38 million medical records. These sobering statistics have made the healthcare industry take notice. Many healthcare organizations are taking steps to increase cybersecurity, but more can be done. This article will look at some of the more recent ransomware cases and the mistakes that were made in dealing with them, and we'll offer ways to improve cybersecurity and protect patient data moving forward.

The consequences of a data breach reach far beyond the breaking news story; there's more to it than the short article that appears on your screen. A single attack can close down an organization for good. It can happen in a few minutes, and the consequences can have long-lasting implications. This is particularly true for healthcare. Yes, the reputation of the healthcare center gets flushed down the toilet, but there's also a real impact on patients. These incidents are not merely expensive inconveniences. Cyberattacks disrupt the entire ecosystem of the institution and put people's health, safety, and lives at risk.

 

Healthcare Worker Distressed by Ransomware Locking up IT systems
Security breaches will cost healthcare organizations $6 billion this year.

 

Often, the healthcare center gets victimized twice. First, there is a ransomware attack. Second, the healthcare system becomes the target of a class-action lawsuit from a community of angry patients and their families.

Consider the New Scientist article about the 2016 attack on the Hollywood Presbyterian Medical Center. On a Friday afternoon, February 5, malware infected the institution's computers, seized patient data, and cut the staff off from further communication. (The same day, computer hackers tried to steal $1 billion from the Federal Reserve Bank of New York.) It all happened in a matter of seconds. Medical records had to be kept with pen and paper, staff fell back on old fax machines, patients were sent to other hospitals, and operations were canceled. The medical center was back online after a two-week standoff, but not until it had paid a bitcoin ransom equivalent to about $17,000 at the time.

Malware can infect an entire computer system. Someone clicks a link to a booby-trapped website or opens an attachment in a phishing email, and the malware immediately gets to work encrypting files. Some malware can immobilize entire IT infrastructures. If data is backed up, though, you can always roll back to yesterday's data after an attack.
Healthcare targets often have their backs against the wall during a cyberattack because they don't have their files backed up.
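To make the roll-back idea concrete, here is a minimal sketch of keeping dated snapshots that can be restored after an incident. The paths are hypothetical, and a real strategy would also keep copies offline and off-site so the backups themselves cannot be encrypted.

```python
# Minimal sketch of the "go back to yesterday's data" idea: keep dated copies
# of a data directory so an encrypted working copy can be discarded and
# restored from the most recent clean snapshot. Paths are hypothetical.
import shutil
from datetime import date
from pathlib import Path

DATA_DIR = Path("/srv/emr-exports")          # hypothetical live data location
BACKUP_ROOT = Path("/backups/emr-exports")   # hypothetical backup location

def take_daily_snapshot() -> Path:
    """Copy today's data into a dated snapshot directory."""
    snapshot = BACKUP_ROOT / date.today().isoformat()
    shutil.copytree(DATA_DIR, snapshot)      # fails if today's snapshot already exists
    return snapshot

def restore_latest_snapshot() -> Path:
    """After an incident, replace the live data with the newest snapshot."""
    latest = sorted(BACKUP_ROOT.iterdir())[-1]
    shutil.rmtree(DATA_DIR, ignore_errors=True)
    shutil.copytree(latest, DATA_DIR)
    return latest

if __name__ == "__main__":
    print("snapshot written to", take_daily_snapshot())
```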

In most cases, a ransom is paid, the hackers deliver the decryption key, and the medical center is able to decrypt the seized files. The Hollywood Presbyterian Medical Center case was straightforward: staff handled the crisis as best they could (see the pen-and-paper measures above), negotiated a lower ransom, and got their data back. More recent victims haven't been so lucky.

Medical malpractice has been part of the healthcare landscape since the 1960s. Now there is the additional risk of malpractice exposure during ransomware attacks: if an attack affects a patient in any way, there will be repercussions.

Doctor Using Tablet
While only a few healthcare systems have policies around using mobile devices, there is a growing movement to regulate such devices.

Take the cyberattack on LifeBridge Health. Seven months after the incident, the Baltimore-based health system faced another problem: a class-action lawsuit was filed against it. The lawsuit claimed negligence on the part of the medical center and accused LifeBridge of waiting two months before informing the affected patients.

LifeBridge had to respond to the allegations. The organization contracted a national computer forensic team to investigate the attack. Patients were offered credit monitoring and identity protection services.

Clearly, basic mistakes contribute to breaches, and mistakes can allow the infiltration to happen in the first place. Resolving a ransomware situation is stressful, and people can do things that make the situation worse.

Ransomware Recovery Mistakes

Health Management Concepts (HMC) in Florida was attacked with ransomware. HMC learned about the incident on July 16, and the official report was made on August 23. The ransom was paid, the attackers delivered the decryption keys, and the hospital IT administration immediately took steps to decrypt the data. To their horror, the HMC staff then realized they had made the problem worse: they had accidentally sent files containing patient information to the hackers.

UnityPoint Health had the misfortune of suffering two security breaches in 2018. The second attack compromised the data of 1.4 million patients, at least by the official tally. A series of phishing emails had been made to look like they came from a top executive within the company. An employee fell for the scam, giving hackers the opening they needed to penetrate the entire system.

The protection of healthcare assets is not just a matter of protecting patient information but of protecting the patients themselves.
Recognizing the Risk is the First Step Toward Protecting Patient Information

The onslaught of cyberattacks against healthcare is relentless, but there are inspiring stories of medical centers fighting back. They're defending themselves against nefarious cyberattacks, saving money, increasing their efficiency, and better protecting their patients.

One such story belongs to the Interfaith Medical Center of Brooklyn, New York. It’s a 287-bed non-profit teaching hospital that treats more than 250,000 patients every year. They were able to avoid malware outbreaks. Their proactive approach enabled them to detect and respond immediately to advancing threats. Their strategy involved an assessment of threats and implementation of policies and procedures.

Incident response time is critical; measure it with a stopwatch, not a calendar. All the segmentation in the world isn't any good if the door isn't closed in time. Their program was successful: it identified malware infections long before they had a chance to become a problem, and they were even able to identify a malware-infected medical device after it came back from a repair vendor.

The Interfaith Medical Center anticipated a ransomware attack and took steps to prepare for it. In a September 3, 2019, Healthcare IT News article, we learn how Christopher Frenz, the VP of Information Security, protected the non-profit's IT system. “One of the ways I approached this was simulating a mass malware outbreak within the hospital, using a custom-developed script and the EICAR test string. Running the script attempted to copy and execute the EICAR test string on each PC within the organization to simulate the lateral movement of a threat within the hospital. Exercises like these are great because they help an organization identify what security controls are effective, which controls are ineffective or in need of improvement, how well or not the staff response to an incident will be, and if there are any deficiencies in the organization’s incident response plan,” he explained.
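Frenz's actual script isn't published, but the core idea is easy to illustrate. The sketch below is a simplified stand-in, not his tool: it drops the harmless, industry-standard EICAR test string onto a few hypothetical target paths and then checks whether endpoint protection removed or quarantined the file.

```python
# Simplified illustration of the exercise described above: write the harmless
# EICAR anti-virus test string to target directories, wait, and check whether
# endpoint protection reacted. Target paths are hypothetical placeholders.
import os
import time

# Industry-standard EICAR test string (harmless by design, flagged by most AV engines).
EICAR = r"X5O!P%@AP[4\PZX54(P^)7CC)7}$EICAR-STANDARD-ANTIVIRUS-TEST-FILE!$H+H*"

# Hypothetical directories standing in for reachable endpoints or admin shares.
TARGETS = [r"\\ws-001\c$\temp", r"\\ws-002\c$\temp"]

def drop_and_check(directory: str, wait_seconds: int = 30) -> str:
    path = os.path.join(directory, "eicar_test.com")
    try:
        with open(path, "w") as handle:
            handle.write(EICAR)
    except OSError as err:                 # unreachable host, blocked write, etc.
        return f"{directory}: could not write test file ({err})"
    time.sleep(wait_seconds)               # give the AV agent time to react
    if os.path.exists(path):
        os.remove(path)                    # clean up after ourselves
        return f"{directory}: test file SURVIVED - endpoint protection did not react"
    return f"{directory}: test file removed - endpoint protection responded"

if __name__ == "__main__":
    for target in TARGETS:
        print(drop_and_check(target))
```

The value of an exercise like this is the report it produces: which endpoints reacted, which didn't, and how long the response took.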

Christopher Frenz, Interfaith Medical Center's VP of Information Security
Christopher Frenz, VP of Information Security at Interfaith Medical Center, led the charge with a zero-trust architecture that protected the network from cyberattacks and saved the healthcare system millions of dollars.
“We have successfully avoided malware outbreaks and are actively detecting and responding to advanced threats, long before they impact privacy or operations.”

Christopher Frenz, Interfaith Medical Center

 

The article ends with some excellent advice from Frenz. “Healthcare needs to begin to focus on more than just compliance alone, as it is far too easy to achieve a state where an organization meets compliance requirements but is still woefully insecure. Organizations need to put their security to the test. Pick solutions that can empirically be shown to improve their security posture.”

 

There are basic steps healthcare organizations can take to minimize their risk of ransomware attacks. Learn as much as you can about how ransomware attacks work, and consider all possible points of entry: where is your IT system vulnerable? Medical software used for patient data has numerous vulnerabilities. A Kaspersky Security Bulletin on healthcare cybersecurity found easy access to 1,500 devices used by healthcare professionals to process patient images such as X-rays.

 

Improving the cybersecurity of a healthcare organization, whether large or small, has two parts. One part has to do with the design and implementation of the entire IT system (for example, whether backup and disaster recovery features are in place). The other part has to do with your human capital.

 

Malware can be introduced from any number of locations along your network, and attacks are often designed with multiple points of entry. It could be a phishing email where an employee is tricked into clicking on something that is booby-trapped, or a bogus email that looks like it came from an upper-level executive but is actually from a hacker.

 

ONGOING EDUCATION AND REFRESHER COURSES
Healthcare Employees Being Educated on Cyber Security Procedures
Healthcare employees should have regular and comprehensive cyber threat education. This enables them to avoid falling into traps that can trigger ransomware. It also serves to establish a strong security culture.

Human beings make mistakes. This is especially true in the busy, high-stress environment of a hospital, or in situations where doctors, nurses, and orderlies work extended 10- to 12-hour shifts. People have to be educated about the risks of cyberattacks and the forms such attacks might take. It's easy for a rushed employee at the tail end of a shift to unknowingly click a file, download unauthorized software, or be tricked into loading a contaminated thumb drive. There are basic security practices that should be implemented, such as creating strong passwords and changing them at regular intervals. Two-factor authentication is also a good idea.
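Two-factor protection usually means pairing the password with a time-based one-time code. Here is a minimal sketch of how those codes are generated and checked, assuming the third-party pyotp library; the secret is generated on the fly purely for illustration.

```python
# Minimal sketch of time-based one-time passwords (TOTP), the mechanism behind
# most authenticator-app second factors. Uses the third-party pyotp library;
# the secret here is generated for illustration only.
import pyotp

# Each user gets a random shared secret, normally stored server-side and
# enrolled into their authenticator app (often via a QR code).
secret = pyotp.random_base32()
totp = pyotp.TOTP(secret)

code = totp.now()                          # what the user's authenticator app would show
print("current one-time code:", code)

# At login, the server verifies the submitted code against the same secret.
print("code accepted:", totp.verify(code))            # True within the 30-second window
print("stale code accepted:", totp.verify("000000"))  # almost certainly False
```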

Cybercrooks study human vulnerabilities and continually figure out ways to exploit people's habits and gullibility. Through social engineering tactics, cyber attackers design pathways to plant ransomware or gain a foothold in an information system.

 

SECURITY IS NOT ABOUT QUICK FIXES

Take the time to ensure that staff and vendors are mindful of what they're doing. Review policies and procedures for handling patient data and for avoiding security incidents. As we have seen, any data breach has legal ramifications, so there needs to be a systematic response that is carefully considered and forged into a process. Additionally, partner with the right vendor, one who can design and provide a holistic security solution that will protect your patients.

What is the Cloud?

How many of us really know what the cloud is? Oh sure, we know that the cloud involves storing and accessing stuff via the Internet, but do we understand the powerful, transformational nature of cloud computing? Do we appreciate how it has changed, and continues to change, the way we live and work?

Not that long ago if you mentioned the cloud, most people thought you were talking about the weather. As recently as 2012, Wakefield Research discovered that 51% of the people surveyed, most of whom were Millennials, thought that storm conditions could interfere with cloud computing. Later that same year, Business Insider reported only 16% understood the cloud to be a network of Internet-connected devices to store, access, and share data. So if you don’t know that much about the cloud, don’t feel bad. You’re not alone.

Most people, if they think of the cloud at all, know it simply as a place to keep their iTunes libraries or to archive their favorite movies, family pictures, and videos. Consumers know the cloud as a storage service offered by Apple; our knowledge of iCloud is usually limited to the company's invitation to add more space. Then there's Netflix, where millions of people access feature-length movies stored and delivered on demand via cloud technology. Do you store and share large files via Dropbox? Does your office use Microsoft Office 365?

This article won't describe the cloud per se, nor will it attempt to explain the various types and configurations of clouds. Instead, it offers a high-level overview of how cloud technology transforms companies and whole industries, and how it changes the way we work with each other all over the world. Technological growth keeps accelerating, and that acceleration is due to all of these technologies blending together into the cloud.

 

The Supernova
The Cloud is a Supernova

 

We use a soft, fluffy metaphor like the cloud, but “the cloud” paints a misleading picture in our minds. The Pulitzer Prize-winning writer Thomas L. Friedman, in his book THANK YOU FOR BEING LATE, prefers to call the cloud “the supernova,” a term originated by Microsoft computer designer Craig Mundie. Why refer to it as “the supernova” and not “the cloud”? In astronomy, a supernova is the explosion of a star; it is, in fact, the largest explosion that takes place in space.

So too, the cloud is an incredible release of energy, one that reshapes every man-made system our society has built. And now, every person on the planet who can access the Internet can tap into its power. The only difference, Mundie points out, is that a star's supernova happens only once, while the computing supernova keeps releasing energy at an accelerating rate. It's also worth noting that the components that make up the cloud keep dropping in cost even as performance keeps going up.

Just as the discovery of fire was a game-changer in the Stone Age, and electricity lit the way from one century to the next in the late 19th century, the cloud has fundamentally changed the modern world. There are more mobile devices on the planet than there are people, and soon everyone on the planet will be connected.

Go with the Flow

The cloud moves large amounts of digital information in every direction, up and down, in a fast, white-water current. You have to learn to go with the flow if you're going to thrive and keep your balance, the way the body maintains constant homeostasis. Do that and you'll be better equipped to look ahead, predict trends, and respond to the ever-changing market.

The Flow of Knowledge Stocks

In the past, the traditional idea was to go to college, get an education, find a job where you could apply that education, show up, and do the work, and you'd be set for life. The focus was on one person holding a stock of knowledge. Today, the focus has shifted to the flow of knowledge. As the 2009 Harvard Business Review article “Abandon Stocks, Embrace Flows” points out, it's no longer about having knowledge.

As the world accelerates, stocks of knowledge become outdated and depreciate at a faster rate, so the premium shifts to continually updating what you know. The most marketable characteristics will be a high level of curiosity and the habit of keeping a finger on the pulse of the latest advancements. The same is true for the things you buy: notice how quickly product life cycles have compressed, and how even the most successful products fall by the wayside faster than before. We have to keep learning by participating in relevant flows of new knowledge, and it's not just a matter of diving into the flow when we feel like it. Benefiting from this flow of knowledge requires that we also contribute to it on an ongoing basis.

This is the world of the cloud. This is where workspaces connect globally. Ideas and knowledge are exchanged freely. The so-called little guy can compete with the big guy. In the March 2016 study “Digital Globalization: The New Era of Global Flows” by the McKinsey Global Institute, we see in great detail how the world is more interconnected than ever.

Many enterprise companies are taking advantage of this interconnectivity, leveraging the technology to tap into the knowledge flows moving around the planet. For example, Friedman describes in THANK YOU FOR BEING LATE how General Electric supplements its internal engineering resources by running global contests to see who can come up with the best design solutions. One such contest received 697 entries from companies and individuals all over the world.

It’s All About Interconnectivity

This interconnectivity is expanding “instantaneous exchanges of virtual goods.” The cloud enables digitized financial flows to happen at unfathomable rates. The science journal Nature published “Physics in Finance: Trading at the Speed of Light.” It presents an industry driven by ever-increasing speed and complexity. The article reports that more than 100,000 trades occur in less than a second. That’s for a single customer.

High-frequency trading relies on several things: fast computer algorithms for deciding what and when to buy and sell, live feeds of financial data, and roughly $15,000 a month to rent fast links.

Moving faster also increases the likelihood of mistakes. In 2012, a flaw in the algorithms of Knight Capital, one of the largest U.S. high-frequency firms, caused a loss of $440 million in 45 minutes. The algorithm accidentally bought at a higher price than it sold.

Data speedbumps act like traffic cops slowing down the flow of traffic.

Some trading firms established a way to keep the traffic from moving too fast: a kind of digital speed bump that slows the flow of digital traffic by 350 microseconds. Apparently, that was all the time traders needed to benefit from faster feeds. The fact that a 350-microsecond speed bump matters at all suggests we've already surpassed the optimal speed for trading.

Speed & Complexity Are Free

Because information moves much faster now, global markets have become more interdependent. Remember when China made some financial missteps in 2015? The result was a ripple effect that stretched across the planet, and Americans felt it immediately. On August 26, 2015, CNN.com reported:

“The American stock market has surrendered a stunning $2.1 trillion of value in just the last 6 days of market chaos. The enormous losses reflect deep fears gripping markets about how the world economy will fare amid a deepening economic downturn in China. The Dow, S&P 500, and Nasdaq have all tumbled into correction territory. It is their first 10% decline from a recent high since 2011. The dramatic retreat on Wall Street has been fueled by serious concerns about the fallout of China’s economic slowdown.”

PayPal has become one of the most important drivers of digital finance. The company set out to democratize financial services by enabling every citizen to move and manage money, and the explosion of smartphones gave users all the power of a bank branch at their fingertips. The incremental cost of adding a customer is nearly zero. What is commonplace for Americans, sending money to someone, paying a bill, or getting a loan, became simple, easy, and nearly free for 3 billion people around the world, people who previously had to stand in line for hours to change their currency and in another line for hours to pay a bill. PayPal doesn't rely on FICO scores the way a traditional bank or credit card company does; instead, it uses its own big-data analytics based on your actual transaction activity on its site, which gives it a more accurate picture of your creditworthiness. The result: instant loans to more people around the world with a higher rate of payback. PayPal is one of the companies eliminating the need for cash, and it is also experimenting with blockchain for validating and relaying global transactions through multiple computers.

Cloud technology has brought with it a period of adjustment. We need time to absorb, learn, and get used to the idea of working differently. The cloud will make economies measurably more productive. Because of it, individuals, groups, and organizations are now on a level playing field, able to shape the world around them in unprecedented ways, and with less effort.

Leverage & Synergy

There has never been a better time to become a maker, an inventor, a start-up founder, or an innovator. It's leverage and synergy in action as never before.

Leveraging Technology

 

Consider some of these examples:

Uber: The world's largest taxi company owns no taxis.
Facebook: The most popular media owner creates no media.
Alibaba: The world's most valuable retailer has no inventory.
Airbnb: The largest accommodation provider owns no real estate.

THE DOUBLE-EDGED SWORD

Technology has always been an amplifier of the best and worst of humanity; it magnifies our psychological and spiritual condition, both good and bad. Cloud technology is a double-edged sword. On one hand, it empowers individuals, groups, and organizations as never before. Companies communicate faster and more fluidly, small boutique shops can become multi-national enterprises in a short amount of time, more brains are connected globally, and the smallest voices can be heard everywhere for the first time.

Alternatively, technology can be used to belittle and disempower. Just as the cloud enables builders and makers, it also gives power to breakers: one person can do more damage more cheaply and more easily. Take Navinder Singh Sarao, for example. Operating from one computer on a network connection out of his parents' house in West London, Sarao single-handedly manipulated the U.S. stock market into losing a trillion dollars in less than half an hour. He “spoofed” the Chicago Mercantile Exchange into setting off a terrible chain reaction. Spoofing is an illegal technique of flooding the market with bogus buy and sell orders so that other traders, both human and machine, are fooled into helping the perpetrator buy low or sell high. Sarao had developed algorithms to alter how his orders would be perceived by other computers.

Big forces can come out of nowhere and crush your business. You’ll never see them coming. The mobile broadband-supernova is a double-edged sword. How it’s used depends on the values and tools we want to put into place.

WE BECOME WHAT WE BEHOLD
We shape our tools and then our tools shape us.

In summation, the cloud, our technological broadband supernova, is here to stay, even if it won't be the same cloud a few months from now. Things will continue to accelerate, and it's going to be difficult for many to keep up. Keeping up may be one of the great challenges facing society in the decades to come.

In answering the question, “Why is the world changing so fast?” Dr. Eric C. Leuthardt states in his “Brains and Machines” blog:

The reason for accelerating change is similar to why networked computers are so powerful. The more processing cores you add, the faster any given function occurs. Similarly, the more integrated that humans are able to exchange ideas the more rapidly they’ll be able to accomplish novel insights.

Different from Moore's Law, which involves the compiling of logic units to perform more rapid analytic functions, increased communication is the compiling of creative units (i.e., humans) to perform ever more creative tasks.

A great primer for anyone interested in understanding the transformational power of cloud technology is Thomas L. Friedman's 2016 book THANK YOU FOR BEING LATE: AN OPTIMIST'S GUIDE TO THRIVING IN THE AGE OF ACCELERATIONS.