Blog

The Coronavirus Tests Global Readiness for Remote Work
As the threat of a coronavirus pandemic wipes away trillions of dollars in market value, the largest mass exodus from the traditional office is underway.
The coronavirus threat forces the question, “Are we ready to have our employees work from home?” Organizations want to do whatever they can to help contain the spread of the virus.

HIMSS, one of the top healthcare conferences of the year, was canceled at the last minute. Everyone knows why. The canceled HIMSS conference was only the first of a series of conference cancellations this month. How many more conferences will be canceled? Only time will tell. A quick online survey shows that Google, Intel, Facebook, and Twitter have canceled many of their conference plans. The South by Southwest, or SXSW, conference has not yet buckled under pressure to cancel.

Andrew Keshner reports in a MarketWatch article that, “As the Coronavirus spreads, companies are increasingly weighing if they should, or even can, have workers do their jobs from home.” The article goes on to report that Twitter told its 5,000 employees around the world to work from home. BBC News reports that Twitter’s head of human resources, Jennifer Christie, said, “Our goal is to lower the probability of the spread of the Covid-19 coronavirus for us – and the world around us.” Twitter has been developing ways for employees to work from home, and its mandate moving forward is to enable anyone, anywhere to work at Twitter. Twitter began moving to a more mobile workforce before the coronavirus. Now, many companies are taking steps to enable employees to work from home. Asian-based organizations, the ones that could, have already implemented work-from-home options. Several giant multinational companies such as Citigroup have restricted travel to Asia.

The Best Advice: Plan and Prepare

The media seems to suggest there are only two states you can exist in: ignorant bliss or outright panic. There’s a wide territory between those two extremes. People should not panic. They should be aware of what’s going on, have an appropriate level of concern, and respond. Managing risk is an important part of life, and it’s also an important part of leading a business. Understand the risk, understand what might happen, and make decisions to keep business moving.

The Centers for Disease Control and Prevention (CDC) has announced that it can’t contain the coronavirus, which means we’re down to implementing mitigation strategies. The CDC is turning to non-pharmaceutical interventions (NPIs). These translate to things like closing schools and preventing people from attending large gatherings. If necessary, officials will ask for self-imposed quarantines. If self-imposed quarantines don’t work, the CDC will issue a contained quarantine order, which means there’s no choice in the matter.

The CDC recommends that companies encourage telework: “For employees who are able to telework, the supervisor should encourage employees to telework instead of coming into the workplace until symptoms are completely resolved. Ensure that you have the information technology and infrastructure needed to support multiple employees who may be able to work from home.” Technologies enabling employees to work remotely have existed for some time now, and interest has grown over the years. It has simply been a matter of deciding to offer that flexibility to your employees. Managers have to determine the ratio of working in the office to working at home. It’s more a leadership decision than any limitation of the technology. But the coronavirus threat will certainly act as a catalyst accelerating the adoption of remote collaboration tools. Most companies will be forced to have their employees stay home. Microsoft has announced free upgrades: Office 365 users can now make full use of the video conferencing and recording features of Microsoft Teams.

Businesses can replace in-person meetings with video and increase networking options. Now is a good time for businesses of all kinds to start preparing. If you don’t have the infrastructure already in place, start planning it. Most organizations are not prepared for wide-spread enablement of remote departments; many are still evaluating requirements and solutions. Workers can work as effectively at home as in the office. Research indicates employees are even more productive working from their home offices.

Moving to the Cloud Has Never Made More Sense Than Now

Cloud technology and remote workspaces enable organizations to be flexible with their staff. It’s also an attractive incentive when recruiting talented employees. Astute business leaders want to be in a position to offer remote collaboration tools to their employees and to establish parameters in which a work-from-home culture thrives. Jennifer Howe, VP of SMMA, an architectural firm in Boston, and acting president of ACEC Massachusetts, said, “Remote workspaces are invaluable these days. You can’t recruit and retain talent without that kind of flexibility.”

A recent article on the Fortune website calls it the “world’s largest work-from-home experiment.” Millions of businesses all over the world are trying to stay productive amidst this growing crisis. The article goes into detail on the level of upheaval for companies, particularly in Southeast Asian countries: “One of the most unsettling factors for employees is the rapidly-changing impact of the virus. It is prompting daily changes in corporate directives.” We’re seeing that kind of impact in the States as more and more cities declare a state of emergency.

 

A giant experiment is underway to see how well new technologies can enable successful mass remote working for employees.

 

Managers worry the exodus from the office will lower productivity, but many studies support the exact opposite: productivity doesn’t go down, it goes up. A 2017 Stanford University study is often quoted; it found a 13% increase in productivity. A study conducted at the U.S. Patent and Trademark Office showed remote workers had a 4.4% increase in output. A recent survey by the consulting firm Deloitte found 82% of white-collar workers using flexible work options.

 

What Does Remote Work Look Like?

Unlike companies that are designed from the start to hire work-from-anywhere employees, traditional in-office companies have to decide how this will work. Management has to set parameters on how remote work happens and communicate the expectations to employees. How will the team stay in contact with each other throughout the day? What level of responsiveness is needed? Does your staff need access to robust programs like AutoCAD, Maya 3D, or Adobe After Effects? If so, how, on a technical level, is that going to happen? For example, GPU-hungry programs will need to be hosted on a virtual server, and the work-in-progress files will have to be stored in some central location. This isn’t accomplished overnight. Now is a good time to start having those discussions.

The worst thing you could do is nothing. Business leaders shouldn’t ignore the situation as it continues to escalate. Ask yourself: if this continues, would your company be able to operate productively? To what extent would your company be forced to stop its activity altogether?

At some point, we are all going to enter the coronavirus tunnel and make it through to the other side. The collective experience will force us to redefine the way we work. We will consider how we interact with each other. Who operates as a self-starter? Who needs closer supervision?

Alvin Toffler was a writer, businessman, and futurist. He envisioned the digital revolution long before it happened and foresaw the remote workforce as an inevitable 21st-century trend.

The idea of remote work is not a new one; it goes back 50 years. Futurist writer Alvin Toffler wrote about remote work in his 1980 book The Third Wave: “When we suddenly make available technologies that can place a low-cost “work station” in any home, providing it with a “smart” typewriter, perhaps, along with a facsimile machine or computer console and teleconferencing equipment, the possibilities for home work are radically extended.”

Cloud technology lets a home computer, the “low-cost workstation” Toffler describes, or any mobile device for that matter, serve as what is essentially a dumb terminal. The processing power actually comes from a virtual desktop. For all practical purposes, it’s just like working from your office: you have access to the same emails, the same software applications, and the exact same files.

Right now, the coronavirus is forcing us to reconsider work-from-home scenarios. Moving personnel to a more comfortable and safer work-from-home environment has its benefits. For some businesses, this means building some kind of infrastructure.

I’d like to close with a question posed near the end of the Forbes article: “If you are an employer and you have the power to offer greater freedom to your workers, should you not be thinking about how to do so?”


Technology’s Impact on Healthcare

Technology is transforming the way healthcare operates, and the impact is not on one level but on many. It is certainly a game-changer for the way communication happens and the way data is stored. Most importantly, it is truly enhancing the patient experience. Technology transforms the way patients are diagnosed and treated, and it’s also transforming the way the business side is handled.

The true dynamo behind the great healthcare overhaul is mobile technology: the smartphones and tablets carried by doctors and nurses as they move from one location to another. Cloud technology provides on-demand access to any IT resource you can imagine, and it delivers resources previously unavailable. This blog will introduce some of these new resources. Because these resources make use of cloud computing, they can be accessed from any device anywhere on the planet where there’s an Internet connection. The added benefit, again because it is in the cloud, is the flexibility and versatility of being able to scale capacity up or back as needed. Bandwidth and storage scale with demand. Gone are the days of being frustrated with a workstation because it is slow.

There are two drivers behind this technology: one is to reduce costs, and the second is to improve the quality of patient care.

There are more mobile devices than there are people on Earth. Clinicians are connected as never before. This means that medical professionals can immediately tap into, contribute to, and benefit from a growing pool of global medical knowledge. At the swipe of a finger, a doctor can access the latest research on a given disease, learn about the latest drug, or review clinical trial outcomes. They can benefit from the collective experience of colleagues worldwide.

Things are changing from the patient side as well. Patients are becoming increasingly accountable for their own health and well-being. They’re doing their homework on diseases and illnesses. They want access to their own data. In the June 13, 2017, Forbes magazine article How The Cloud is Transforming Healthcare, Khalid Raza writes, “providers must satisfy the demand for instant, top-quality access to healthcare services. Patients – who are accustomed to the 24/7 availability and service from online retailers and financial institutions – expect and even demand such access and services from their healthcare providers. People have become more involved in managing their own healthcare needs, which only complicates matters, and gravitate to the web for diagnosis, information, and treatments.”

Software companies have kept their finger on the pulse of these industry-wide healthcare trends and have responded with new technologies designed to significantly contribute to the flow of knowledge and the efficiency of future healthcare. There are now multiple secure messaging technologies available to doctors who want a quick, informal consultation with a colleague. These tools share many of the same features; for example, all communication is tracked and logged automatically.

Here are a few of the new technologies that are changing the face of medicine. And they’re all being facilitated by cloud computing in one way or another.

 

DIGITAL FLOWS SPEED UP DIAGNOSIS, PROGNOSIS & TREATMENTS

There are still thick, heavy reference books collected throughout doctors’ offices and nursing stations. These mammoth books are collecting a lot of dust now; they have probably been forgotten or left where they were simply for reasons of interior design. Now if a nurse or doctor needs a quick reference, they pull out their smartphone. Mobile apps enable clinicians to quickly dial into any information needed about drug interactions or complications associated with a particular condition.

 

The Med360 Mobile App

Med360 is a program that automatically collects every new publication matching your interests, pulling data from thousands of traditional open-access journals and funneling it into your personal stream. A doctor has only to call up the app on his or her smartphone, do a quick scan of the screen, and know exactly what’s going on with the patient’s medication history and reconciliation. Pharmacy pickups, dosage changes, and refills are presented in a clear interface on the clinician’s mobile device.


VAST AMOUNTS OF DATA

The February 2019 article in Nature Medicine reported on a program that used patient information such as symptoms, history, and lab results to diagnose common childhood diseases. According to the article, the system was given data on nearly 600,000 patients at a pediatric hospital in China. The results produced by the system were highly accurate.

In another February 2019 article, Cade Metz reported that Google is developing and testing systems that analyze electronic health records in an effort to flag medical conditions such as osteoporosis or diabetes. Similar technologies are being developed to detect signs of illness and disease based solely on X-rays, M.R.I.s, and retina scans. The main thing these innovations have in common is their reliance on neural networks, a breed of artificial intelligence that learns tasks largely on its own by analyzing vast amounts of data.

Computers can be programmed to recognize patterns amongst vast amounts of data, and these patterns can be linked to specific conditions. These are patterns that would be difficult, if not impossible, for a person to notice. Huge amounts of data from medical imaging are fed into artificial neural networks. The program follows an algorithm, and the computer proceeds to learn on the job, so to speak. The more data it receives, the better it becomes at interpreting the data.

This learning process is already being used in many applications. Computers learn to understand speech and identify objects this way. Self-driving cars can recognize stop signs and tell the difference between a pedestrian and a telephone pole. Google has created a program to help pathologists read microscope slides to diagnose things like cancer.
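The learning idea described above can be sketched in a few lines of code. This is a toy illustration only: the two “lab values,” the rule linking them to a “condition,” and the training parameters are all invented for demonstration. Real diagnostic systems use deep neural networks trained on vast clinical datasets.

```python
# Toy illustration: a single logistic unit (the building block of a neural
# network) "learns" to link a pattern in data to a label. All data is
# synthetic and the pattern is invented for demonstration purposes.
import math
import random

random.seed(0)

# Synthetic "patients": two hypothetical lab values. The invented rule is
# that the condition is present when the two values are jointly elevated.
data = []
for _ in range(400):
    a, b = random.uniform(0, 1), random.uniform(0, 1)
    label = 1 if a + b > 1.0 else 0
    data.append((a, b, label))

# Train by stochastic gradient descent: nudge the weights a little after
# each example, in the direction that reduces the prediction error.
w1 = w2 = bias = 0.0
rate = 0.5
for _ in range(2000):
    a, b, label = random.choice(data)
    pred = 1 / (1 + math.exp(-(w1 * a + w2 * b + bias)))
    err = pred - label
    w1 -= rate * err * a
    w2 -= rate * err * b
    bias -= rate * err

# After seeing enough examples, the model has recovered the hidden pattern.
correct = sum(
    1 for a, b, label in data
    if (1 / (1 + math.exp(-(w1 * a + w2 * b + bias))) > 0.5) == (label == 1)
)
accuracy = correct / len(data)
```

The model is never told the rule; it infers it from examples, and the more examples it sees, the sharper its decision boundary becomes. That, scaled up enormously, is the mechanism behind the diagnostic systems described above.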

 

Mobile devices are the key to tapping into knowledge flow streams.

KNOWLEDGE ACCESS ON ANY DEVICE ANYWHERE

The fact that everything is accessible on any device anywhere means patients can get medical help at the hospital, at the ambulatory center, or in the comfort of their own home. In the past, if you wanted to see a doctor, you had to physically travel to the doctor’s office or go to the emergency room.

Now, much of that care can appropriately be pushed into the patient’s home.

 

Telehealth is the distribution of health-related services and information via electronic information and telecommunication technologies. It allows long-distance patient and clinician contact, care, advice, reminders, education, intervention, monitoring, and remote admissions.

Hospital at Home, a program at Mount Sinai, enables video visits. Patients can check in, access monitoring tools, and input their vital statistics, checking things like pulse, blood pressure, or weight. The information can then be sent to the patient’s care team for review and response.

In a May 10, 2019, Harvard Business Review article, Albert Siu and Linda V. DeCherrie report that “research has shown varying but clearly positive impacts on mortality, clinical outcomes, readmission rates, and cost. A 2012 meta-analysis of 61 randomized, controlled trials, for instance, found that the hospital-at-home patients had a 19% lower six-month mortality rate compared to hospitalized patients. Our research finds that patients who receive hospital-at-home care have fewer complications and readmissions; they also rate their health care experience more highly.”

Bruce Darrow, M.D., Ph.D., Chief Medical Information Officer at Mount Sinai in New York.

Bruce Darrow, M.D., Ph.D., cardiologist and Chief Medical Information Officer for Mount Sinai Health Systems says, “It’s empowering for the patient and it’s good for the clinicians too. The technology allows doctors to let the patients do the jobs they would want to do themselves.  Artificial Intelligence is going to be essential to healthcare. When we think about doing the work with patients at growing population levels effectively, A.I. technology is going to play an important role. If I’m a primary care doctor who is taking care of 2,500 patients, only 20 or 30 of those patients will come into my office on any given day. At the same time, there may be several at home who are at risk. Rather than combing through the entire list of 2,500 patients, if I have tools to look at the prior history of the patient along with their current vital signs, I can determine who I need to see first.”

 

Medical record systems are notorious for not communicating with one another.

Darrow goes on to say, “Electronic medical records have been challenging to connect to one another because of the way they were born. The original idea was not to generate a national patient identity that would allow the same patient to be identified as such from one system to another. There was no original standard for what the medical records would do and how they would interoperate with each other.

The government and the healthcare industry have recognized the problem. That’s where the work of the next few years will be. We’re making progress. At this point, I have patients who come to see me in the office. I can pull their information from a number of systems throughout  the New York area as well as nationwide.”

Telehealth

Telemedicine is the practice of caring for patients remotely when the provider and patient are not physically present with each other. HIPAA-compliant video technology enables clinicians to consult with their patients effectively. Patients can follow up with their doctor through a video visit instead of making the trip to the hospital or clinician’s office. Patients can get an on-demand video visit with emergency-trained doctors. A doctor can consult virtually with a specialist, or a stroke specialist can be brought in virtually to participate in the care of an emergency room patient. All of these things are possible today.

 

The Main Benefit of VDI
VDI Planning: 4 Key Pitfalls to Avoid
What is VDI?

Virtual Desktop Infrastructure (VDI) enables virtualized desktops hosted on remote servers on the Internet. Reducing the need for hardware while improving flexibility, VDI offers practical benefits as well as a hefty return on investment, and there is a strong business case to be made. According to IDC’s January 2016 report “The Business Value of VMware Horizon,” there is a five-year return on investment of 413 percent. On average, a virtualized desktop costs 71 percent less to buy, deploy, support, maintain, and use over a five-year period, on a per-device basis. Users spend 76 percent less time on device and application log-ins. VDI enables companies to make full use of human capital while preventing many IT-related issues. We need all the help we can get to unlock massive human assets such as talent, empathy, and creativity; you know, the things computers aren’t that good at. There are indeed great advantages to moving to a DaaS environment, but there are also many opportunities for making mistakes along the way. Let’s take a look at the four most common pitfalls associated with VDI migration.
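For readers who want to sanity-check a headline figure like that, here is the standard ROI arithmetic. The dollar amounts below are hypothetical, chosen only to illustrate how a 413 percent result is produced; they are not from the IDC study.

```python
# Illustrative only: how a multi-year ROI percentage is computed.
def five_year_roi(total_benefit, total_cost):
    """ROI as a percentage: net gain divided by cost."""
    return (total_benefit - total_cost) / total_cost * 100

# Hypothetical example: $1,539,000 in cumulative five-year benefit
# on a $300,000 investment works out to roughly 413 percent.
roi = five_year_roi(1_539_000, 300_000)
```

The same formula applies per device or per user; what matters is counting the full cost side (deployment, support, maintenance) the way the report does.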

A TechRepublic article cites a lack of planning as a major pitfall of VDI integration, reporting that companies failed to plan for enough resources. Don’t provision just for today or tomorrow; design an infrastructure that will serve your needs next year and for the years ahead. That article was from 2013, and it is just as relevant today.

Decide what the priorities are in your VDI environment.

The problem with most VDI implementations is a lack of planning. Internal stakeholders should begin with a comprehensive assessment of the IT environment, including the individual desktop environment. The VDI landscape has changed over the years. Planning and project management are the keys to a successful VDI adoption. The initial steps start with an internal dialogue, and it’s a good idea to bring in outside expert advice early in the process. Each company is unique, with different demands and different expectations. The time and effort put into VDI planning will pay incredible dividends for years.

Here are a few of the most common hurdles. They can be overcome when identified early.

VDI Planning
A common problem with VDI planning is wanting to include everything.
Don’t Try to Do Everything at Once

The first common issue in rolling out a VDI initiative is trying to do too much at once. This applies to large and small environments alike. VDI does not look the same at any two companies.

Don’t try to include every attractive feature in your initial implementation. Be focused on meeting key objectives, and be selective. Understand the major features and benefits of VDI, but don’t try to include everything in the beginning; this will only slow down the process and distract you from your key objectives. A white paper by VMware recommends taking a step back and considering what you’re trying to do before you even think about IT requirements. Instead of diving straight into technical requirements, such as numbers of servers and sizing of WAN links, begin by exploring user needs, business drivers, and special requirements. These special requirements might include things like compliance issues, high availability, disaster recovery plans, or even the need for the business to rapidly onboard large numbers of new users due to mergers or acquisitions.

Don’t get stuck on an age-old VDI question, such as whether to use non-persistent or persistent desktops in the initial deployment.

A company may never deliver a usable VDI solution if it allows itself to get stuck on an idea. Let’s say you determine that 99% of your VDI desktops will be non-persistent. You need to know that forcing that outcome can consume countless OpEx and CapEx funds.

Stay Focused on Key Points
Zero in on what’s most important to you in a VDI environment.

Narrow down what you need in the planning stage to get VDI into a solid, usable state. Set up your VDI on a set of lean criteria; you can make additions as you go.

Do an Effective Initial Assessment

The next hurdle is company-specific, and it is often overlooked due to the upfront cost and time: the VDI assessment that should be a part of the planning. The VDI assessment is the discovery phase of the project. It will help you isolate and focus on what is most important for your business.

Identify who will be using the VDI solution. The assessment has two parts: discussion and analysis. Be sure the process includes all the stakeholders, including those who will be using the virtual desktops. Getting them involved early in the design process will help manage expectations, and it will go a long way toward nurturing acceptance of the resulting VDI environment.

Bring All the Brains to the Table
Bringing all the brains to the table will ensure the existing infrastructure is understood and all solution options are on the table.

Let’s use the example of an HR group that will be using VDI during the initial deployment. There is an initial interview, and the agenda includes setting expectations for VDI. Begin by looking at how the group currently uses the computing environment.

Discussions along these lines will establish some parameters.
Do they generally use a combined set of only four applications? Do they work at varied times throughout the day? Do they only need a web browser and the ability to email clients on the company network?

You also need to gather data on what the traditional desktops are doing during the day. What applications are used? What is needed for the machines to operate?

Most PCs are oversized, with wasted resources. VDI is all about compute and storage density, and determining accurate sizing needs means more cost savings. There are several tools that can do the second part of this equation, but don’t overlook the first.
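Once the assessment has produced per-desktop figures, the sizing math itself is simple. The sketch below is a hypothetical back-of-the-envelope calculation; the overcommit ratio, headroom, and host specs are illustrative assumptions, not vendor guidance.

```python
# Hypothetical VDI host-count estimate from assessment data.
# The overcommit ratio and headroom values are illustrative assumptions.
import math

def hosts_needed(users, vcpu_per_user, gb_per_user,
                 cores_per_host, gb_per_host,
                 cpu_overcommit=4.0, headroom=0.2):
    """Estimate how many hosts a user population needs, sized by
    whichever resource (CPU or RAM) runs out first."""
    usable_cores = cores_per_host * cpu_overcommit * (1 - headroom)
    usable_gb = gb_per_host * (1 - headroom)
    by_cpu = math.ceil(users * vcpu_per_user / usable_cores)
    by_ram = math.ceil(users * gb_per_user / usable_gb)
    return max(by_cpu, by_ram)

# Example: 200 users at 2 vCPUs and 8 GB each, on 32-core, 512 GB hosts
hosts = hosts_needed(200, 2, 8, 32, 512)
```

With a 4:1 CPU overcommit and 20 percent headroom, that example works out to four hosts. Tightening either assumption changes the answer, which is exactly why accurate assessment data matters.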

Don’t Overlook Management and Support Responsibilities
This third point concerns IT staff.

Who will be managing the new environment once the consultants have departed? Will you share this duty between existing desktop/infrastructure teams? Or will a new team arise to manage the entire solution? Decide this early on.

Managing a VDI environment requires an engineer who understands several key technologies and how they affect the virtual desktop. These technologies include, but are not limited to:

Networking
Know how users connect to the virtual desktop and where to troubleshoot problems like lost connections or poor performance.

Compute/Infrastructure
A deep understanding of hypervisors and server infrastructure, depending on the vendor of choice.

Security
Knowledge of the security products that will run inside the virtual desktops and sit in the network path of the virtual desktop, for troubleshooting purposes.

Desktop Engineering
Basic knowledge for customizing Windows installations and troubleshooting.

Additionally, there are several other ancillary technologies that come in handy. These technologies include DNS, Active Directory, Application Packaging/Delivery, Load Balancing, and Storage.

These skills can come from various classroom training offerings, but many should come from experience. Knowing how all these different technologies work together in your environment is critical.

In larger companies, many of these technologies are owned and managed by separate teams. It is crucial that all the stakeholders be aware of the impact of VDI.

Know who has ownership of the new VDI systems, and make sure there is buy-in from across your IT organization. This is important to establish in the beginning. Everyone needs to be on the same page, which will make it easier for training to occur for those needing to ramp up.

This ownership and buy-in include first-line defenders like your typical service desk team. Let them know they’re responsible for fielding certain common VDI-related issues as they come in, and provide education and resources to support them. Service and support are a key benefit of partnering with seasoned VDI consultants.

Don’t Forget the User Experience

As VDI deployment comes together, don’t forget about the user experience.

The User Experience Is Important
User experience is the final litmus test. How the user feels about the experience determines the success or failure of VDI or DaaS.

Consider how things were before VDI. Chances are, your employees have been using similar pieces of hardware. They know how their workstation machines perform every day (good or bad). They’ll compare the new VDI environment to what they had before.

This goes back to the assessment stage. Understanding the proper sizing and performance of each machine is important; it can mean the difference between a successful adoption and one that isn’t. But it’s also more than that.

If a user now has to log in twice to access their virtual desktop, they will complain. If the machine hangs when opening a video conference, they will complain. If patches cause reboots on different days, they will complain. You want to make the changeover to VDI as seamless as possible.

The experience should be better than, not equal to or worse than, a traditional desktop. Make sure you plan to provide the expected performance of each workstation, and allow for a tailored storage solution that is intelligent and optimized for VDI. Consider network outages: if, for whatever reason, users can’t access their virtual desktops, that is also a problem. Here’s the point: outside factors contribute to the total experience on a virtual desktop, and many of these factors will be beyond your control.

The successful adoption of VDI means user acceptance. Deliver a desktop-like experience, and provide the training and support necessary. Company-wide buy-in is key to the success of the whole program, and it all begins with planning and making sure you have every brain at the table when that happens.

Ransomware Targets Healthcare
The Healthcare Ransomware Epidemic: How to Protect Your Patients
The Problem is Becoming a Crisis

Data breaches are happening at an alarming rate. In fact, the threat of ransomware attacks has been elevated to crisis levels. While there’s increased awareness, attacks are becoming more sophisticated. Organizations large and small are being attacked; no one is immune. The healthcare industry has been, and continues to be, a prime target, and for good reason. Healthcare organizations are considered low-hanging fruit by cybercriminals. Hackers know healthcare centers are notorious for having insufficient security. Most hospitals don’t have procedures in place to restore a network once it is locked by ransomware. Most hospital applications have little or no network segmentation, there are no firewalls between workloads, and basic security protocols are not in place.

Beyond the alarming ransomware statistics, some attacks never get reported. The U.S. Department of Health and Human Services recorded 52 data breaches in October. Last year, hackers stole over 38 million medical records. These sobering statistics have made the healthcare industry take notice. Many healthcare organizations are taking steps to increase cybersecurity, but more can be done. This article will look at some of the more recent ransomware cases, examine some mistakes that were made in dealing with cyberattacks, and offer ways to improve cybersecurity and protect patient data moving forward.

The consequences of a data breach reach far beyond the breaking news story; there’s more to it than the short news article that appears on your computer screen. A single attack can close down an organization for good, and it can happen in a few minutes. The consequences can have long-lasting implications, particularly for the healthcare industry. Sure, the reputation of the healthcare center gets flushed down the toilet, but there’s a real impact on the patients. These incidents are not merely expensive inconveniences. Cyberattacks disrupt the entire ecosystem of the institution and put people’s health, safety, and lives at risk.

 

Healthcare Worker Distressed by Ransomware Locking up IT systems
Security breaches will cost healthcare organizations $6 billion this year.

 

Often, the healthcare center gets victimized twice. First, there is a ransomware attack. Second, the healthcare system becomes the target of a class-action lawsuit from a community of angry patients and their families.

Consider the New Scientist article about the 2016 attack on the Hollywood Presbyterian Medical Center. On Friday afternoon, February 5, malware infected the institution’s computers, seizing patient data and cutting off staff communication. (The same day, computer hackers tried to steal $1 billion from the Federal Reserve Bank of New York.) It all happened in a matter of seconds. Medical records had to be kept with pen and paper. Staff fell back on old fax machines. Patients were sent to other hospitals, and operations were canceled. The medical center was back online after a two-week standoff, but only after paying a ransom of 50 bitcoins (the equivalent of $17,000 at the time).

Malware can infect an entire computer system. Someone clicks a link to a booby-trapped website or opens an attachment in a phishing email, and the malware immediately gets to work encrypting files. Some variants can immobilize entire IT infrastructures. If your data is backed up and malware strikes, you can always fall back to yesterday’s data.
Healthcare targets often have their backs against the wall during a cyberattack precisely because they don’t have their files backed up.

In most cases, a ransom is paid, the hackers deliver the decryption key, and the medical center is able to decrypt the seized files. The Hollywood Presbyterian Medical Center case was straightforward. The staff handled the crisis as best they could, reverting to pen and paper as noted above. They negotiated a lower ransom and their data was returned. More recent victims haven’t been so lucky.

Medical malpractice has been part of the healthcare landscape since the 1960s. Now there is the additional risk of malpractice claims arising from ransomware attacks. If an attack affects a patient in any way, there will be repercussions.

Doctor Using Tablet
While only a few healthcare systems have policies around using mobile devices, there is a growing movement to regulate such devices.

Take the cyberattack on LifeBridge Health. Seven months after the incident, the Baltimore-based health system faced another problem: a class-action lawsuit. The suit claimed negligence on the part of the medical center and accused LifeBridge of waiting two months before informing the affected patients.

LifeBridge had to respond to the allegations. The organization contracted a national computer forensic team to investigate the attack. Patients were offered credit monitoring and identity protection services.

Clearly, basic mistakes contribute to breaches. Mistakes can allow the infiltration to happen in the first place. And because resolving a ransomware situation is stressful, people can do things that make the situation worse.

Ransomware Recovery Mistakes

Health Management Concepts in Florida was attacked with ransomware. HMC learned about the incident on July 16; the official report was made on August 23. The ransom was paid, and the attackers delivered the decryption keys. The hospital IT administration immediately took steps to decrypt the data. To their horror, the HMC staff realized they had made the problem worse: they accidentally sent files containing patient information to the hackers.

UnityPoint Healthcare had the misfortune of suffering two security breaches in 2018. The second attack compromised the data of 1.4 million patients. At least, that’s the official tally. A series of phishing emails had been made to look like they were from a top executive within the company. An employee fell for the scam. It gave hackers the opportunity needed to penetrate the entire system.

The protection of healthcare assets is not just a matter of protecting patient information but protecting the patients themselves.
Recognizing the Risk is the First Step Toward Protecting Patient Information

The onslaught of cyberattacks against healthcare is relentless. There are inspiring stories of medical centers fighting back. They’re defending themselves against nefarious cyberattacks. They’re saving lots of money. Increasing their efficiency. And better protecting their patients.

One such story belongs to the Interfaith Medical Center of Brooklyn, New York. It’s a 287-bed non-profit teaching hospital that treats more than 250,000 patients every year. They were able to avoid malware outbreaks. Their proactive approach enabled them to detect and respond immediately to advancing threats. Their strategy involved an assessment of threats and implementation of policies and procedures.

Incident response time is critical: measure it with a stopwatch, not a calendar. All the segmentation in the world isn’t any good if the door isn’t closed in time. Interfaith’s program was successful. It identified malware infections long before they had a chance to become a problem. The team was even able to identify a malware-infected medical device after it came back from a repair vendor.

The Interfaith Medical Center anticipated a ransomware attack and took steps to prepare for it. In a September 3, 2019, Healthcare IT News article, we learn how Christopher Frenz, the VP of Information Security, protected the non-profit’s IT system. “One of the ways I approached this was simulating a mass malware outbreak within the hospital, using a custom-developed script and the EICAR test string. Running the script attempted to copy and execute the EICAR test string on each PC within the organization to simulate the lateral movement of a threat within the hospital. Exercises like these are great because they help an organization identify what security controls are effective, which controls are ineffective or in need of improvement, how well or not the staff response to an incident will be, and if there are any deficiencies in the organization’s incident response plan,” he explained.
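Frenz’s exact script isn’t published, but the idea is easy to sketch. The following Python snippet is a minimal, hypothetical version of such an exercise: it “drops” the industry-standard EICAR test string (a harmless 68-byte file that mainstream antivirus engines flag as malware by design) into a set of stand-in endpoint directories, then checks whether each endpoint’s controls removed it. The endpoint directories here are illustrative stand-ins for PCs on a hospital network.

```python
import os
import tempfile

# The industry-standard EICAR test string (68 bytes): harmless by design,
# but flagged as malware by virtually every antivirus engine.
EICAR = (
    r"X5O!P%@AP[4\PZX54(P^)7CC)7}$"
    r"EICAR-STANDARD-ANTIVIRUS-TEST-FILE!$H+H*"
)

def drop_test_file(directory: str) -> str:
    """Write the EICAR test string into a directory, simulating the
    lateral movement of malware onto that endpoint."""
    path = os.path.join(directory, "eicar_test.com")
    with open(path, "w") as f:
        f.write(EICAR)
    return path

def endpoint_blocked(path: str) -> bool:
    """If an endpoint's antivirus is working, the dropped file should be
    quarantined or deleted shortly after it is written."""
    return not os.path.exists(path)

# Hypothetical "endpoints": stand-in directories for PCs on the network.
endpoints = [tempfile.mkdtemp(prefix=f"pc{i}_") for i in range(3)]
results = {d: endpoint_blocked(drop_test_file(d)) for d in endpoints}
for host, blocked in results.items():
    print(host, "BLOCKED" if blocked else "NOT BLOCKED -> control gap")
```

Any endpoint where the file survives represents a gap in controls, exactly the kind of deficiency Frenz describes surfacing before a real outbreak does.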

Christopher Frenz, Interfaith Medical Center's VP of Information Security
Christopher Frenz, VP of Information Security at Interfaith Medical Center, led the charge with a zero-trust architecture that protected the network from cyberattacks and saved the healthcare system millions of dollars.
“We have successfully avoided malware outbreaks and are actively detecting and responding to advanced threats, long before they impact privacy or operations.”

Christopher Frenz, Interfaith Medical Center

 

The article ends with some excellent advice from Frenz. “Healthcare needs to begin to focus on more than just compliance alone, as it is far too easy to achieve a state where an organization meets compliance requirements but is still woefully insecure. Organizations need to put their security to the test. Pick solutions that can empirically be shown to improve their security posture.”

 

There are basic steps healthcare organizations can take to minimize their risk of ransomware attacks. Learn as much as you can about how these attacks work. Consider all possible points of entry: where is your IT system vulnerable? Medical software used for patient data has numerous vulnerabilities. The Kaspersky Security Bulletin’s healthcare cybersecurity statistics found easy access to 1,500 devices used by healthcare professionals to process patient images such as X-rays.

 

Improving the cybersecurity of a healthcare organization, whether large or small, has two parts. One part has to do with the design and implementation of the entire IT system (i.e., whether backup and disaster recovery features are in place). The other part has to do with your human capital.

 

Malware can be introduced from any number of points along your network, and attacks are often designed with multiple points of entry. It could be a phishing email that tricks an employee into clicking something booby-trapped, or a bogus message that looks like it came from an upper-level executive but is actually from a hacker.

 

ON-GOING EDUCATION AND REFRESHER COURSES
Healthcare Employees Being Educated on Cyber Security Procedures
Healthcare employees should have regular and comprehensive cyber threat education. This enables them to avoid falling into traps that can trigger ransomware. It also serves to establish a strong security culture.

Human beings make mistakes. This is especially true in the busy, high-stress environment of a hospital, or where doctors, nurses, and orderlies work extended 10- to 12-hour shifts. People have to be educated about the risks of cyberattacks and the forms such attacks might take. It’s easy for a rushed employee at the tail end of a shift to unknowingly click a file, download unauthorized software, or be tricked into loading a contaminated thumb drive. Basic security processes should be implemented: creating strong passwords, changing them at regular intervals, and enabling two-factor authentication.
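As an illustration of one of those basic processes, here is a minimal, hypothetical password-policy check in Python. The length and character-class thresholds are illustrative choices, not a compliance standard, so treat this as a sketch of the idea rather than a vetted policy.

```python
import re

def is_strong_password(pw: str, min_length: int = 12) -> bool:
    """A minimal password policy: sufficient length plus a mix of
    character classes. Thresholds here are illustrative only."""
    checks = [
        len(pw) >= min_length,
        re.search(r"[a-z]", pw),          # at least one lowercase letter
        re.search(r"[A-Z]", pw),          # at least one uppercase letter
        re.search(r"\d", pw),             # at least one digit
        re.search(r"[^A-Za-z0-9]", pw),   # at least one symbol
    ]
    return all(bool(c) for c in checks)

print(is_strong_password("Ward7-Charts!2024"))  # True
print(is_strong_password("password"))           # False
```

A check like this belongs at account creation and at every scheduled password change, alongside, not instead of, two-factor authentication.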

Cybercrooks study human vulnerability. Hackers continually figure out ways to exploit human traits and gullibility. Through social engineering tactics, cyber attackers design pathways to plant ransomware or gain a foothold in an information system.

 

SECURITY IS NOT ABOUT QUICK FIXES

Take the time to ensure that staff and vendors are mindful of what they’re doing. Review policies and procedures for handling patient data and for avoiding security incidents. As we have seen, any data breach has legal ramifications, so there needs to be a systematic response that is carefully considered and forged into a process. Additionally, partner with the right vendor, one who can design and provide a holistic security solution that will protect your patients.

What is the Cloud?

How many of us really know what the cloud is? Oh sure, we know that the cloud involves storing and accessing stuff via the Internet, but do we understand the powerful transformational nature of cloud computing technology? Do we appreciate how it has changed, and continues to change, the way we live and work?

Not that long ago if you mentioned the cloud, most people thought you were talking about the weather. As recently as 2012, Wakefield Research discovered that 51% of the people surveyed, most of whom were Millennials, thought that storm conditions could interfere with cloud computing. Later that same year, Business Insider reported only 16% understood the cloud to be a network of Internet-connected devices to store, access, and share data. So if you don’t know that much about the cloud, don’t feel bad. You’re not alone.

Most people, if they think of the cloud at all, know it simply as a place to keep iTunes purchases, archive favorite movies, or store family pictures and videos. Consumers know the cloud as a storage service offered by Apple; our knowledge of iCloud is usually limited to the company’s invitation to add more space. Then there’s Netflix, where millions of people access feature-length movies stored and delivered on demand via cloud technology. Do you store and share large files via Dropbox? Does your office use Microsoft Office 365?

This article won’t describe the cloud per se, nor will it attempt to explain the various types and configurations of clouds. Rather, it offers a high-level overview of how cloud technology transforms companies and whole industries, and explores the way it changes how we work with each other all over the world. Technology growth is accelerating at compounding rates, and that acceleration is due to all these technologies blending together into the cloud.

 

The Supernova
The Cloud is a Supernova

 

We use a soft, fluffy metaphor like the cloud, but “the cloud” paints a misleading picture in our minds. The Pulitzer Prize-winning writer Thomas L. Friedman, in his book THANK YOU FOR BEING LATE, prefers to call the cloud “the supernova,” a term coined by Microsoft computer designer Craig Mundie. Why refer to it as “the supernova” and not “the cloud”? In astronomy, a supernova is the explosion of a star, a huge astronomical event; in fact, the largest explosion that takes place in space.

So too, the cloud is an incredible release of energy. The energy reshapes every man-made system that our society has built. And now, every single person on the planet who can access the Internet can tap into its power. The only difference, Mundie points out, is that a star’s supernova only happens once. The computer supernova keeps releasing energy at an accelerating rate. It’s interesting to note that the components that make up the cloud continue to drive down in cost. The cost goes down while performance keeps going up.

Just as the discovery of fire was a game-changer back in the Stone Age, and electricity lit the way from one century to the next in the late 19th century, the cloud has fundamentally changed the modern world. There are more mobile devices on the planet than there are people. Soon everyone on the planet will be connected.

Go with the Flow

The cloud has large amounts of digital information moving in every direction. The current moves fast, like white-water rapids. You have to learn to go with the flow if you’re going to thrive; like maintaining homeostasis, going with the flow is how you keep your balance. You’ll be better equipped to look ahead, predict trends, and respond to the ever-changing market.

The Flow of Knowledge Stocks

In the past, the traditional idea was to go to college. Get an education. Find a job where you can apply that education. Show up. Do the work and you’d be fine. You’d be set for life. The focus was on one person having a stock of knowledge. Today, the focus has shifted to the flow of knowledge. As pointed out in the 2009 Harvard Business Review article “Abandon Stocks, Embrace Flows,” it’s no longer about having knowledge.

As the world accelerates, stocks of knowledge become outdated, and depreciate, at a faster rate. The premium shifts to updating knowledge. The most marketable traits will be a high level of curiosity and a finger on the pulse of the latest advancements. This is true for the products you buy as well: notice how quickly product life cycles have compressed, and how even the most successful products fall by the wayside quicker than before. We have to continually learn by participating in relevant flows of new knowledge. And it’s not just a matter of diving into the flow when we feel like it; participating in and benefiting from this flow requires that we also contribute to it on an ongoing basis.

This is the world of the cloud. This is where workspaces connect globally. Ideas and knowledge are exchanged freely. The so-called little guy can compete with the big guy. In the March 2016 study “Digital Globalization: The New Era of Global Flows” by the McKinsey Global Institute, we see in great detail how the world is more interconnected than ever.

Many enterprise companies are taking advantage of this interconnectivity. They’re leveraging the technology in order to take advantage of the knowledge flows moving around the planet. For example, Friedman describes in his book THANK YOU FOR BEING LATE, how General Electric supplements its internal resources of engineers to run global contests to see who can come up with the best design solutions. One such contest received 697 entries from companies and individuals all over the world.

It’s All About Interconnectivity

This interconnectivity is expanding “instantaneous exchanges of virtual goods.” The cloud enables digitized financial flows to happen at unfathomable rates. The science journal Nature published “Physics in Finance: Trading at the Speed of Light.” It presents an industry driven by ever-increasing speed and complexity. The article reports that more than 100,000 trades occur in less than a second. That’s for a single customer.

High-frequency trading relies on several things: fast computer algorithms for deciding what and when to buy and sell, live feeds of financial data, and fast network links that cost about $15,000 a month to rent.

Moving faster also increases the likelihood of mistakes. In 2012, a flaw in the algorithms of Knight Capital, one of the largest U.S. high-frequency trading firms, caused a loss of $440 million in 45 minutes. The algorithm accidentally bought at a higher price than it sold.

Data speedbumps act like traffic cops slowing down the flow of traffic.

Some trading firms established a way to keep traffic from moving too fast: a kind of digital speed bump that slows the flow of digital traffic by 350 microseconds. Apparently, that was all the time traders needed to benefit from faster feeds. That a 350-microsecond speed bump matters suggests we’ve already surpassed the optimal speed for trading.

Speed & Complexity Are Free

Because information moves much faster now, global markets have become more interdependent. Remember when China made some financial missteps in 2015? The ripple effect stretched across the planet, and Americans felt it immediately. On August 26, 2015, CNN.com reported:

“The American stock market has surrendered a stunning $2.1 trillion of value in just the last 6 days of market chaos. The enormous losses reflect deep fears gripping markets about how the world economy will fare amid a deepening economic downturn in China. The Dow, S&P 500, and Nasdaq have all tumbled into correction territory. It is their first 10% decline from a recent high since 2011. The dramatic retreat on Wall Street has been fueled by serious concerns about the fallout of China’s economic slowdown.”

PayPal has become one of the most important drivers of digital finance. The company set out to democratize financial services by enabling every citizen to move and manage money. The explosion of smartphones gave users all the power of a bank branch at their fingertips, and the incremental cost of adding a customer is nearly zero. What is commonplace for Americans, sending money to someone, paying a bill, or getting a loan, became simple, easy, and nearly free for 3 billion people around the world: people who once had to stand in line for hours to change their currency, then stand in another line for hours to pay a bill. PayPal doesn’t rely on FICO scores the way a traditional bank or credit card company does. Instead, it uses its own big-data analytics based on your actual transaction activity on its site, which gives a more accurate picture of your creditworthiness. The result: instant loans to more people around the world, with a higher rate of payback. PayPal is one of the companies eliminating the need for cash, and it is also experimenting with blockchain for validating and relaying global transactions through multiple computers.

Cloud technology has brought with it a period of adjustment. We need time to absorb, learn, and get used to the idea of working differently. The cloud will make economies measurably more productive. Because of it, individuals, groups, and organizations are now on a level playing field, able to shape the world around them in unprecedented ways, and with less effort.

Leverage & Synergy

There has never been a better time to become a maker, an inventor, a start-upper or an innovator. It’s leverage and synergy in action as never before.

Leveraging Technology

 

Consider some of these examples:

Uber

The world’s largest taxi company owns no taxis

FaceBook

The most popular media owner creates no media

Alibaba

The world’s most valuable retailer has no inventory

Airbnb

The largest accommodation provider owns no real estate

THE DOUBLE-EDGED SWORD

Technology has always been an amplifier of the best and worst of humanity. It tends to magnify our psychological and spiritual condition both good and bad. Cloud technology is a double-edged sword. On one hand, it empowers the individual, groups, and organizations as never before. Companies communicate faster and more fluidly. Small boutique shops can become multi-national enterprises in a short amount of time. More brains are connected globally. The smallest voices can be heard everywhere for the first time.

Alternately, technology can be used to belittle and disempower. Just as the cloud enables builders and makers, it also gives power to breakers. One person can do more damage more cheaply and more easily. Take Navinder Singh Sarao, for example. Operating from one computer on a network connection out of his parents’ house in West London, Sarao single-handedly manipulated the U.S. stock market into losing a trillion dollars in less than half an hour. He “spoofed” the Chicago Mercantile Exchange into setting off a terrible chain reaction. Spoofing is an illegal technique of flooding the market with bogus buy and sell orders so that other traders, both human and machine, are fooled into helping the perpetrator buy low or sell high. Sarao had developed his algorithms to alter how his orders would be perceived by other computers.

Big forces can come out of nowhere and crush your business. You’ll never see them coming. The mobile broadband-supernova is a double-edged sword. How it’s used depends on the values and tools we want to put into place.

WE BECOME WHAT WE BEHOLD
We shape our tools and then our tools shape us.

In summation, the cloud, our technological broadband-supernova, is here to stay. It won’t be the same cloud a few months from now, but it’s here to stay. And things will continue to accelerate. It’s going to be difficult for many to keep up. Keeping up may be one of the great challenges facing society in the decades to come.

In answering the question, “Why is the world changing so fast?” Dr. Eric C. Leuthardt states in his “Brains and Machines” blog:

The reason for accelerating change is similar to why networked computers are so powerful. The more processing cores you add, the faster any given function occurs. Similarly, the more integrated that humans are able to exchange ideas the more rapidly they’ll be able to accomplish novel insights.

Different from Moore’s Law, which involves the compiling of logic units to perform more rapid analytic functions, increased communication is the compiling of creative units (i.e., humans) to perform ever more creative tasks.

A great primer for anyone interested in understanding the transformational power of cloud technology is Thomas L. Friedman’s 2016 book THANK YOU FOR BEING LATE: AN OPTIMIST’S GUIDE TO THRIVING IN THE AGE OF ACCELERATIONS.

Ransomware Risk Mitigation: The Desktop-as-a-Service Solution

Ransomware is a dangerous and growing threat. Find out how security-minded executives establish best-in-class protection.

2019 has proven to be an alarming year for cybersecurity professionals and cyber-attacks show no signs of slowing down in 2020.

One cybersecurity firm characterized the rapidly growing pace of cyberthreats across all industries as an “unprecedented and unrelenting barrage”. Within 24 hours of its report, the City of New Orleans and several other municipal organizations fell victim to ransomware attacks.

But it’s not just large-scale enterprises and public institutions that are under attack. Small and mid-sized businesses offer low-hanging fruit for opportunistic cyber criminals, who often use automation to widen their area of attack.

Small businesses, large enterprises, and public institutions alike have all struggled to respond decisively to the ransomware threat. Until recently, executives had few options – and fewer defenses – in their fight against cybercrime. Now, Desktop as a Service (DaaS) solutions offer comprehensive, scalable ransomware protection services to organizations of all sizes.

 

What Exactly is Ransomware and How Does It Work?

 

There are a number of ways for a cyber intruder to take over your computer system, and you won’t know about it until it’s too late.

The typical ransomware attack begins with the stealthy takeover of the victim’s computer. This may be accomplished through phishing, social engineering, or a sophisticated zero-day exploit – the goal is to have access to the network while remaining undetected.

Upon compromising the network, the cybercriminal can begin slowly encrypting important files. Most ransomware applications do this automatically, using a variety of different methods to evade detection. The process may take days, weeks, or months to complete.

Once the ransomware encryption reaches critical mass, it locks users out of the network and displays a ransom note demanding payment for a decryption key. Sometimes the demand is small, on the order of $500 to $1,000, and sometimes it reaches into six-figure sums.

Ransom demands are usually made in bitcoin, and they escalate: “If one organization is willing to pay $500,000, the next may be willing to pay $600,000.”

Small sums make paying the ransom a tempting option, but a dangerous one. There is no guarantee that the cyber attacker will relinquish control of the network. Instead, executives who pay up reinforce the cybercriminal profit cycle. It is only a matter of time before the ransomware attacker strikes again.

Famous examples of ransomware variants include WannaCry, which spread to over 230,000 computers across 150 countries in 2017, and Petya. The WannaCry crisis targeted healthcare clinics and hospitals, causing untold damage and highlighting the risk that outdated IT systems represent in these industries.

Petya was unique because it did not encrypt specific files. Instead, it encrypted the local hard drive’s Master File Table, rendering the entire device unusable. NotPetya built on Petya’s attack method, using the same vulnerability previously exploited by WannaCry. There are dozens of other variants out there, and each one uses a unique strategy to take advantage of victims.

Who Is At Risk of Ransomware Attacks?

 

Emsisoft reports that during the first half of 2019, 491 healthcare providers were hit with ransomware. The attacks are increasing and the demands are for larger ransoms.

Everyone is at risk. Although high-profile targets like hospitals and municipal institutions make headlines, thousands of business owners are defrauded every day. On average, one business falls victim to ransomware every 14 seconds.

Small and mid-sized businesses are especially vulnerable because they typically do not have access to the kind of comprehensive security resources that large enterprises can afford. Small businesses that do not rely on reputable third-party managed service providers make especially easy targets.

Cybercriminals have shown that they are willing to target hospitals and public institutions without shame. The greater the need for functioning IT systems, the more likely the cybercriminals are to get paid. This is how the cybercrime profit cycle perpetuates itself.

What Can Small and Mid-sized Businesses Do About Ransomware?

 

Organizations caught unprepared have few options. Although cybersecurity experts correctly warn against paying the ransom, desperate business owners often pay anyway. But the relief is only temporary: 60% of small and mid-sized businesses victimized by cybercriminals never recover, shutting down within six months.

Preparation is key to successfully resisting a ransomware attack. Organizations that cannot afford to develop, implement, and deploy state-of-the-art security resources need to contract a reputable third-party vendor for the purpose.

Even enterprise-level organizations with tens of thousands of employees often find themselves opting for a managed solution instead of an in-house one. The cybersecurity industry is experiencing a widening talent shortage, making it difficult even for deep-pocketed businesses to hold on to their best security officers.

Introducing IronOrbit: Comprehensive Ransomware Protection

IronOrbit achieves best-in-class ransomware protection through a unique approach to cloud desktop hosting. There are three key processes that must work together flawlessly to guarantee ransomware resilience:

1.   Prevention

The best way to prevent a ransomware attack is to block the initial malware deployment. Firewalls, email filters, content filters, and constant patch management all play a critical role in keeping malicious code out of DaaS systems.

Maintaining up-to-date software is more important than most executives and employees realize. Since NotPetya used the same attack vector as WannaCry, its victims consisted entirely of individuals and businesses who had neglected to install security patches after the WannaCry crisis.

2.   Recovery

There is no way to guarantee 100% prevention. However, business owners and their IT teams can circumvent the damage ransomware causes with consistent backup and restoration tools. IronOrbit’s disaster recovery features can wind back the clock, reloading your entire suite of business systems to the state they were in just before the attack occurred.
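IronOrbit’s actual tooling is proprietary, but the “wind back the clock” idea can be sketched generically: keep a catalog of point-in-time snapshots and restore the most recent one taken before the attack began. The snapshot contents and timestamps below are hypothetical stand-ins for real system state.

```python
from datetime import datetime

# Hypothetical snapshot catalog: timestamp -> saved system state.
snapshots = {
    datetime(2020, 3, 1, 2, 0): {"patients.db": "v1"},
    datetime(2020, 3, 2, 2, 0): {"patients.db": "v2"},
    datetime(2020, 3, 3, 2, 0): {"patients.db": "v2-ENCRYPTED"},  # post-attack
}

def restore_before(attack_time: datetime) -> dict:
    """'Wind back the clock': return the most recent snapshot taken
    strictly before the attack began."""
    candidates = [t for t in snapshots if t < attack_time]
    if not candidates:
        raise RuntimeError("no clean snapshot available")
    return snapshots[max(candidates)]

# The attack is detected mid-afternoon on March 2; restore skips the
# encrypted March 3 snapshot and lands on the last clean one.
attack = datetime(2020, 3, 2, 14, 30)
print(restore_before(attack))  # -> {'patients.db': 'v2'}
```

The design point is simple: recovery quality is bounded by snapshot frequency, so the shorter the interval between backups, the less work is lost when the clock is wound back.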

3.   Remediation

Ransomware recovery alone cannot guarantee business continuity without best-in-class remediation tools. Without the ability to trace the attack to its source in a fully logged environment, there is no way to tell whether the threat has truly been eliminated. IronOrbit uses state-of-the-art digital investigation tools to track ransomware attacks to their source and mitigate them.

Schedule a Consultation with an IronOrbit Security Expert

IronOrbit has helped numerous businesses capitalize on the efficiency and peace of mind that secure DaaS solutions offer. Protect your business from the threat of ransomware with the help of our expertise and knowledge.

 

The Top Cloud Solutions Every AEC Firm Should Be Using in 2020

The quantity and quality of cloud offerings have grown significantly in the last few years, and a number of new solutions are especially worth an AEC firm’s attention. We’ll look at a few of the top cloud solutions in this article.

The AEC industry faces incredible growth, driven by urbanization and globalization. As traditional data centers become insufficient to meet these demands, the demand for security and efficiency rises with them. Hyper-automation, the distributed cloud, and practical blockchain are just some of the trends that will continue to proliferate into 2020, and each has the ability to transform and optimize initiatives. The AEC sector has done a good job of keeping up to date: more than two-thirds of AEC firms store data in the cloud. The reason is that cloud solutions are an important ingredient in optimizing workflow, costs, and sustainability.

Cloud storage, for instance, can be a more cost-effective alternative to spending thousands on upgrading your local IT. Cloud computing has become increasingly popular for a number of reasons: it’s more affordable, it can handle computationally intensive work, it offers workspace flexibility, and it’s more secure.

Also, there is the added advantage that, with cloud-based computing, it’s possible to view and work with complex renderings on an underpowered device.

Scalability is also easier in the cloud. Most providers allow for scalable, on-demand resource usage. This enables your company to have more computing power when it’s needed.

Cloud storage providers deliver benefits that are difficult to duplicate on premises. A Cisco report suggested, “By 2021, 94 percent of workloads and compute instances will be processed by cloud data centers.”

Firms that have moved to the cloud have an edge over the competition. With that in mind, let’s take a look at what kinds of cloud services are available and how each one benefits an AEC firm.

 

1 – Cloud Storage

Housing large CAD files with a cloud storage provider is less expensive than storing them on premises.

 

Cloud storage is a great solution because it is simple and offers several important benefits.

First, cloud storage providers back up your data. Cloud storage companies often keep servers in two different parts of the world. One server might be on the West Coast of the United States and the other on the East Coast, so that even the largest disaster won’t wipe out your files.

Files stored on the cloud can be accessed from anywhere. Whether your crew is working 10 or 1,000 miles away, they’ll have easy access to everything stored online. It’s also simple to set permission levels on various files. For example, an administrator can dictate that renderings can be viewed at the job site but only edited in the office.
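The rendering example above boils down to a small lookup table. Here’s a minimal sketch of location-aware permissions; every name here is a hypothetical illustration, not any cloud storage provider’s actual API:

```python
# A minimal sketch of location-aware file permissions, in the spirit of
# the rendering example above. All names are hypothetical illustrations,
# not any cloud storage provider's actual API.

PERMISSIONS = {
    # (role, location) -> set of allowed actions
    ("field_crew", "job_site"): {"view"},
    ("drafter", "office"): {"view", "edit"},
    ("admin", "office"): {"view", "edit", "share"},
}

def can(role: str, location: str, action: str) -> bool:
    """Return True if this role may perform the action from this location."""
    return action in PERMISSIONS.get((role, location), set())

# Renderings can be viewed at the job site but only edited in the office.
print(can("field_crew", "job_site", "view"))   # True
print(can("field_crew", "job_site", "edit"))   # False
print(can("drafter", "office", "edit"))        # True
```

Real providers expose richer sharing controls than this, but the underlying idea is the same: the admin declares who can do what, and the platform enforces it.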

 

Everyday examples of cloud storage providers include Dropbox, Google Drive, SharePoint, and OneDrive.

2 – Cloud Storage Gateways

Cloud storage gateways can help reduce costs in a number of ways. Data compression reduces bandwidth usage and stretches available storage. Gateways can also make smart decisions about where to save files. Files that are accessed frequently are called “hot files,” and hot files can be more expensive to store online. A gateway may keep them in local storage while moving infrequently used files to the cloud.
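The hot-versus-cold decision described above can be sketched in a few lines. The one-week threshold is an illustrative assumption, not any vendor’s actual tiering policy:

```python
# A sketch of the hot/cold tiering decision a cloud storage gateway
# might make. The one-week threshold is an illustrative assumption,
# not any vendor's actual policy engine.

from datetime import datetime, timedelta

HOT_WINDOW = timedelta(days=7)  # accessed within a week => "hot"
NOW = datetime(2020, 3, 1)      # fixed "current" time for the example

def choose_tier(last_accessed: datetime) -> str:
    """Keep hot files on fast local storage; push cold files to the cloud."""
    return "local" if NOW - last_accessed <= HOT_WINDOW else "cloud"

print(choose_tier(datetime(2020, 2, 28)))  # local (a hot CAD file)
print(choose_tier(datetime(2019, 12, 1)))  # cloud (a cold archive)
```

Production gateways weigh more signals than last-access time (file size, project status, cost per tier), but the basic trade-off is exactly this one.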

Panzura

Panzura has a cloud-based version designed to create a shared environment for everyone to work in. With Panzura, users can store CAD and BIM files online and open them in a matter of seconds, not minutes. All files can be accessed from any location, making collaboration easier. Panzura claims to “reduce infrastructure costs up to 70%” over traditional data centers.

Like all cloud storage service providers, Panzura makes it easy to share files and collaborate across a variety of devices.

Another of Panzura’s interesting features is the work-sharing monitor. Work-sharing allows for remote viewing of another Panzura user’s workstation. As with other cloud-based solutions, Panzura can scale as your firm grows.

3 – Cloud-Based Accounting and Management Applications

Running your firm’s accounting and management applications in the cloud simplifies workflow. Collaboration is fluid when everyone at the company has access to the files, whether they’re at the office, at home, or in a hotel room on the other side of the country.

Decreased IT and Hardware Costs: You’ll be able to significantly reduce IT and hardware costs when you host your Sage or QuickBooks software off-premise with a cloud hosting provider.

The cloud enables collaborators to work in real-time. Different permission models can be set for different users. It is also easier to collaborate with suppliers, distributors, and contractors. Since the files and data are already online it’s simple to give outside parties access. That’s opposed to localized data which is more difficult to share.

Deltek is a cloud-based solution to track projects. Project steps can be broad or detailed. Deltek tracks billable hours, resource usage, and expenses. If your firm uses different programs for different purposes, it may be time to consolidate.

While localized programs have offered project tracking for years, Deltek makes collaboration easier for the whole team. Everyone from accounting to the drafting team has access to the same program. The functionality is the same whether they’re on a $5,000 workstation or a $200 smartphone.

 

4 – Internet of Things (IoT)

The number of Internet-connected devices is growing at an exponential rate. As microchips and transmitters are becoming more affordable, more applications and innovations are being introduced onto the market. This means more tools for the AEC industry to track and improve efficiency.


Internet-connected GPS devices, for instance, are great at tracking fleet mileage. They can also make recommendations about necessary vehicle maintenance. Bluetooth tags attach to pieces of equipment, making them easy to locate on a crowded job site and helping recover lost tools.

 

5 – Hosted Desktops

Hosted desktops transform computers into more powerful workstations without having to purchase expensive PC hardware.

A hosted desktop is ideal for AEC firms running multiple AutoCAD workstations. Scaling is also a breeze as hosted desktops can increase their resources to handle any task.

A hosted desktop transforms a cheap laptop or tablet into a powerful workstation: the kind of device that can launch power-hungry programs and model complex drawings. That makes it great for crews out in the field who don’t normally have access to powerful computers.

 

IronOrbit INFINITY: The All-in-One Solution
The all-in-one solution offered by IronOrbit provides peace of mind, increased agility, and true synergy with key organizational objectives.

While cloud solutions (here are 7 good reasons why AEC firms need the cloud) offer a number of advantages to AEC firms, not all products are created equal. A product like INFINITY offers tremendous functionality and flexibility.

INFINITY is a convenient cloud-based workstation. It combines the best features of cloud-based solutions into one place. This includes:

  • Hosted desktops
  • Cloud storage (including Panzura integration)
  • Application hosting (any application, including accounting and ERP software)
  • Unlimited computing, upgrades, and bandwidth
  • Managed backups and disaster recovery
  • Managed security and compliance
  • 24/7 US-based IT support

INFINITY is a workspace that allows access to CPU and graphics-intensive applications from anywhere.

With unlimited CPU and RAM upgrades, you never have to worry about running out of processing power. Collaboration is easy when your whole team is working in a centralized environment.

Centralization allows the team to work with the same software version. The whole team uses the same application and works from the same set of files. There’s no longer the concern that employees are accessing different versions.

Inconsistent files can cause delays, accidents, compliance violations and more. Having the team working from the same set of files is a considerable benefit.

Then there’s the convenience of dealing with one vendor. Forget about tracking costs from multiple service providers. There’s no dealing with half a dozen account managers and support teams. IronOrbit gives you everything in one package so that you’ll only ever need to work with a single company.

An Easy-To-Manage and Future-Proof Solution, Too

Speaking of the all-in-one package, IronOrbit wants to make cloud management as simple as possible. They take care of setup, backups, application security as well as user support.

Other cloud solutions require the client to do a number of things: set up the service, integrate it with existing infrastructure, and manage it on an ongoing basis. With IronOrbit, these time-consuming processes are gone.

When considering a cloud service provider, it’s important to think about the future. IronOrbit is backed by today’s cutting-edge technology. It’s flexible enough to support any future technology growth.

IronOrbit is constantly upgrading its infrastructure. Their clients can take advantage of unlimited GPU and RAM, unlimited bandwidth, and unlimited upgrades to the latest versions of Windows and Microsoft Office.

Whatever devices and operating systems the future holds, IronOrbit INFINITY will be ready to support them and give you the same level of service that you’d expect from a professional cloud-solution package. With INFINITY’s centralized, all-inclusive and convenient software you’re always in good hands.

In his book Thank You for Being Late, Thomas Friedman refers to the exponential acceleration in technology based on Moore’s Law: the speed and power of microchips double every 24 months.
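That doubling compounds faster than intuition suggests. A quick back-of-the-envelope calculation makes the scale concrete:

```python
# Illustrative arithmetic for Moore's Law as stated above:
# power doubles every 24 months.

def relative_power(years: float, doubling_years: float = 2.0) -> float:
    """Power relative to today after `years` of exponential doubling."""
    return 2 ** (years / doubling_years)

print(relative_power(2))   # 2.0    -> twice the power in two years
print(relative_power(10))  # 32.0   -> 32x in a decade
print(relative_power(20))  # 1024.0 -> over a thousandfold in twenty years
```

This is the “age of acceleration” in one formula: small, steady doublings compound into transformative change within a single planning horizon.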

In this age of acceleration, companies have 3 major objectives. They want to do more. They want to move faster. They want to do it at less cost.

To achieve all 3 objectives, a company has to ensure one thing: alignment between its business objectives and its IT capabilities. By harmonizing clear objectives with IT capabilities, companies will realize their full potential. Not all companies are able to follow that path on their own. They may know they need to change, but don’t know how to get there.

If they do know how to get there, they may not know who will manage the new environments. It’s more common now that company leadership won’t have a clue what their future state should look like. Our primary job is to listen to the customer. We need to understand our client’s objectives and future-state desires. Then we analyze the requirements against IT capabilities. The infrastructure has to be solid yet flexible enough to support the future business goals. To the degree this harmony between goals and technology is in place, the intended transformations will happen.

It’s Time for an Upgrade

In the last couple of years, there have been a lot of exciting developments in the AEC cloud industry. From time and maintenance tracking to hosted desktops that let users render complex projects on their iPhones, the cloud has much to offer.

Not to mention savings, when you factor in the thousands or tens of thousands of dollars you can save by not having to continuously upgrade your workstations.

If you haven’t looked into getting on the cloud yet there has never been a better time. Cloud computing and project management are the directions the AEC industry is heading toward. No firm is going to want to play catch up to its competitors.

What is the True Cost & Benefit of Moving to the Cloud?

Moving to the cloud should be more of a business decision than an IT decision. Cloud servers are a keystone of modern business technology. Once you consider moving to the cloud as an initiative to make full use of new technology, you begin to envision the kind of agility, stability, and responsiveness the cloud enables down the road. It’s also a solid first step in future-proofing your business. This perspective demands a view on ROI that moves beyond calculating dollars and cents.

 

Calculating ROI
Calculating the ROI of your technology investment doesn’t have to be rocket science, but remember the saying often attributed to Einstein: “Not everything that counts can be counted.”

Looking beyond spreadsheets and calculations means considering how your technology helps you meet your strategic objectives.  Long-term success depends on a proactive agenda of workforce transformation, strategic flexibility, security, and manageability.  Are your technology investments driving productivity for your business? Are they solving challenges or creating more problems? Answers to questions like these are the main reasons why many companies are moving to the cloud.
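Strictly on the dollars-and-cents side, the classic ROI formula is simple. The figures below are hypothetical illustrations, not real cloud pricing; the strategic benefits discussed above sit on top of whatever number comes out:

```python
# A hypothetical back-of-the-envelope cloud ROI calculation.
# Every dollar figure below is an illustrative assumption, not real pricing.

def roi(gain: float, cost: float) -> float:
    """Classic ROI: net gain divided by cost."""
    return (gain - cost) / cost

annual_cloud_cost = 24_000.0   # hypothetical subscription fees
annual_savings = 30_000.0      # avoided hardware refreshes and IT labor
productivity_gain = 12_000.0   # estimated value of flexible remote work

total_gain = annual_savings + productivity_gain
print(f"ROI: {roi(total_gain, annual_cloud_cost):.0%}")  # ROI: 75%
```

A spreadsheet like this is a starting point, not the answer: workforce flexibility, security posture, and strategic agility rarely fit neatly into the `gain` variable.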

 

Forrester released a report in early 2019 that stressed the importance of corporate leaders gaining more fluency in the technology choices they make. They need to understand the different performance yields of different innovation efforts. It’s important to be visionary about where the company is headed during the years to come. Know what is at stake should you keep your IT infrastructure on-prem or move it to the cloud. Become focused on how to make business technology the basis of a durable strategic advantage.

Board Meeting
While corporate leaders need not be able to use devices, programs, and apps, they should know enough about them to discuss them intelligently with the team.

In a more recent podcast, Forrester gives its top predictions in IoT, AI, and cloud computing.

About half the big enterprise outfits that try to transform their systems fail or stall under the sheer size and complexity of the process. Certainly, a large part of the problem has its origins in the failure to design a strategic plan that works. Don’t put the cart before the horse. Remember the carpenter’s rule, “measure twice, cut once.” You’ll avoid costly mistakes, both in terms of time and money, if you do research and get as much information as possible before you start spending resources on cloud migration.

ADVICE FROM EXPERTS 

Every organization has its own unique strategic needs. Not all businesses have the same priorities. There is no one-size-fits-all approach to developing a strategy or plan to move to the cloud. Any significant technological transformation requires analyses and consultation with experts in the field. It also helps if these experts know as much as possible about your business goals.

The first step is to become clear-eyed on the business strategy.  Evaluate business objectives and assess how your existing technologies align with meeting those plans. Inevitably gaps will become apparent.

Utilize the insights from the best technology consultants you can find. They’ll be able to recommend available options and optimal routes. In some cases, there may not be an immediately available option that best suits your objective. In those situations, something more innovative and customized to specific needs may be needed. This is exactly why a good advisor is critical to successful cloud migration. A good advisor will be a true IT professional, one who stays abreast of the latest technologies, but also one who has a comprehensive understanding of business operations. Having this kind of resource on hand can mean all the difference between a successful transformation and one that goes off the rails. Failed attempts are costly with absolutely no ROI.

While it’s true that every company is unique and each one has its own set of priorities for future growth and productivity, there are a few technology industry trends that can serve as a guiding light.

THE INCREDIBLE EVER-CHANGING WORKFORCE

This isn’t your grandfather’s workplace environment anymore. It’s not even your father’s workplace environment. For people to become fully engaged and productive, they need flexibility over the tools they use. The choice of places to work would be nice too. Employees need reliable and secure access to the resources they use and depend on. Consistency of experience shouldn’t be under-rated either.

Wakefield Research conducted a survey showing the scope of this on-going technological evolution. Not too surprisingly, the report found that 69% of the employees regularly work remotely. Some 21% of them blend environments by working both in an office and somewhere else, such as at home or a communal workspace (Starbucks, anyone?). The survey went on to show that a whopping 80% of the office professionals agree that, within 5 years, businesses will not be competitive without using cloud-based apps. Future-proofing means leveraging cloud servers and taking advantage of new technologies as they become available.

MEETING RISING EXPECTATIONS, PRESSURES, AND DEMANDS FOR INCREASED SECURITY

New business models, competitors, and customer preferences emerge seemingly from nowhere. Turn around for a moment, and there are new things to look at. During this age of acceleration, all of us have to stay on our toes. We have to practically reinvent ourselves from Monday through Friday. Companies of all sizes have to move quickly to capture new opportunities. And if you think it’s intense now, just wait until next year and the year after that. Modern technology and its impact on business is moving at an exponential rate.  I’m getting dizzy just thinking about it.

Even as things are moving at breakneck speed, security demands have never been greater. Security is also more challenging than ever. Check out our previous blog on cyber attacks and ransomware for some not-so-gentle reminders of how costly cyber attacks can be. IT transformation has increased the opportunities available to would-be hackers. And these hackers have their choice of targets, from mobile devices and web apps to IoT. New mandates, like the General Data Protection Regulation (GDPR), have raised the stakes for everyone.

As companies increasingly leverage the cloud to store customer data, SOC 2 compliance is becoming a necessity.
START AT THE BEGINNING

So, let’s start at the beginning of any company’s transformational journey. Ask the question, “Can your current technologies help you meet all the requirements in ways that enable you to move quickly and stay on top of your priorities?”

 

Wakefield Research shows that 69% of the employees regularly work remotely and 21% of them combine home and office environments.

MOBILE FORCES

MORE PRODUCTIVITY, WITH LESS STRESS AND IN LESS TIME

It’s becoming more common to see employees working from home or both at home and in the office. Wherever they choose to plow through their day, they need tools that are smart, fast, and seamless. They need to work collaboratively. They need to open robust programs like Revit, SoftImage, or After Effects, and use them quickly, seamlessly, and without interruption. Having apps on cloud servers enables distributed teams to collaborate easily across great distances. Whatever the scenario, the new IT setup needs to empower your people to get more things done, more easily.

 

KEEP IT SIMPLE

Before making an investment in technology, consider whether it adds to the complexity of your workplace or helps reduce it. Does it help to streamline operations? Does it impose a burden of daily management that diverts attention and resources, or does it free up people’s time so that they can focus more on their own work?

 

SECURITY IS A CHALLENGE

The threat of cyber attacks is greater than ever. A breach of security can be devastating. Finding skilled security professionals has never been more difficult. The more complex the IT environment, the greater the security risk. There are more openings for attacks. Consider public networks, mobile devices, and web apps. There are insider threats, phishing, and so on.

Sometimes it may be worth taking on the additional security risk in exchange for exceptional business value. It’s a trade-off that should be factored into the evaluation of your transformation strategy. Keep in mind, if a technology can make security simpler, more transparent, and more effective, that’s an advantage.

Cryptojacking is the unauthorized use of someone else’s computing devices to mine cryptocurrency. It is accomplished by injecting the system with hidden code that mines on behalf of a third party. About two-thirds of companies targeted by ransomware attacks have been infected.
LEVERAGE THE FLEXIBILITY TO IMPROVE STRATEGY

It’s a great time to be an IT professional or developer. The hybrid, multi-cloud era has brought tremendous freedom and flexibility to what used to be just a metal box and a lot of colorful cables. Now, cloud technology enables us to provision resources on demand, scale easily, and support users anywhere. Cloud servers also allow for beefed-up security and greater performance. The cloud is where data rules supreme. It’s not under the rug, in the closet, or filed away on hard drives stored in a drawer. We now have a place, seemingly with no limits, to put all the data we’re accumulating (organizations stockpile data but seldom dispose of it).

On the user side of things, cloud computing has given employees the freedom to choose any device, time, or place to work. These various cloud options deliver a consistent, quality user experience.

The prediction is that 41% of enterprise workload will be run on public cloud platforms by 2020. Another 20% will be private-cloud-based, while 22% will rely on hybrid cloud adoption.
NO TECHNOLOGY EXISTS IN A VACUUM

If one of your investments limits the utility of another, it degrades the value of both. A good strategic transformation designer will always look at the big picture and assess how everything is connected.

When it comes to remaining profitable while future-proofing a company, not everything is about dollars and cents. Consider the ever-evolving workplace, with all its need for mobile applications, collaboration tools, data crunching, and massive amounts of storage. Keeping our eyes on the big picture is necessary if we’re to evaluate ROI accurately.

The true ROI has to do with information technology that advances key priorities such as productivity, reducing complexity, strengthening security, and ensuring choices are available whenever needed.

 

Is Your Business Future Proof?

 

We already wrote about putting you on the [fast track to entrepreneurism]. The numbers around entrepreneurs making their way into the country are incredible, especially in the Golden State. One more thing is also true: whether you’re just starting your business or leading an established organization, you have to deal with risk.

The ability to mitigate risk in your organization’s infancy is necessary for success. Entrepreneurs juggle the requirements of trying to reduce their tax burden. They follow a list of mandatory regulations longer than a CVS receipt, all while trying to find new business and actually deliver what they sell.

More…

An Introduction To Microsoft’s SHAREPOINT

 

We’ve all been part of a meeting or worked on a project that was more of a pain than it should have been. Far too many slides of a PowerPoint presentation. A flurry of document versions flying everywhere. Confusion about which is the most current and who made the last change. Did anyone take meeting minutes?! Some things never change.

Document management solutions exploded onto the scene as organizations struggled with data sprawl. They all come in on-premises and cloud flavors, and they’re all built with the sole purpose of helping your teams manage documents. All in the name of collaboration. More…