Blog

The Major Business Advantages for a Remote Workforce
Telecommute, remote work, work from home, flexible location: these common terms all describe the ability to do your job from somewhere other than the office. Lately they’ve been on everyone’s mind, joining the lexicon alongside words like coronavirus, pandemic, and physical distancing.

The government is closing down operations deemed non-critical, and more and more state officials are urging people to stay at home. Companies across the globe have had to expand their remote workforce or shut down altogether. Modern technology enables employees to work from home and keep operations afloat, and many positions can make the transition, including virtual assistants, customer service, sales, IT professionals, writers, designers, and more.

Many positions can transition to remote work. For those that can’t, cross-train your staff and shuffle talent in order to leverage their experience with the company.

A recent article in the New York Times reported that over 158 million Americans have been ordered to stay home due to the coronavirus. Britain has an even more stringent lockdown policy, with a country-wide ban on gatherings of more than two people. How many people are now working from home is not yet known. What is known: Zoom, the popular web conferencing SaaS company, noted that it had more active users in the past couple of months than in all of last year.

In a May 5, 2020 Forbes article, Wayne Rash warns that “telling companies to simply have their employees work from home is easier said than done. Not every company has the resources, the training or even the bandwidth to support an en masse move to remote work. In addition, for many companies, a move to working at home requires a significant shift in their corporate culture, something that may be even harder to accomplish than any physical requirements.” The article goes on to suggest running incident management exercises, but the window for practicing those disaster responses has closed. It is true that, as Jack Gold states in the same article, “companies are really going to struggle.” But overcoming these struggles, technical or otherwise, will make our companies stronger and better prepared for the future.

THE PERKS OF WORKING FROM HOME

There are obvious perks to working from home: no commute, you can be comfortable, and your pets get spoiled having you around all the time. There are also advantages for companies that may not be so obvious. In this Owl Labs report, we see that in the US alone, 48% of workers were allowed to work from home at least once a week, and a whopping 30% could work from home full-time. The report includes some interesting stats on job satisfaction and pay as well. We’ll get into employee availability, cost savings, and the technology behind it all a bit later. For now, let’s dig into two questions: Why is a work-from-home option so beneficial to employees? And how does it advance the health and prosperity of the company?

 

Those companies that had a remote work policy in place before the pandemic are in a much better position to make the transition.

A remote work environment liberates the entire company. HR is no longer confined to hiring candidates in one geographic region; you can pull job applicants from around the globe. That dramatically increases the size of the talent pool, and the quality of applicants rises with it. You can find the best talent available and tap into a more diverse workforce. There’s also an ancillary but real boost to the company’s image.

THE BENEFITS GO BEYOND A SOPHISTICATED CORPORATE IMAGE

When a company advertises a work-from-home option, it demonstrates a couple of things, both of which come across as sophisticated and attractive: flexibility and agility, and a culture that pushes the edge.

A Fast Company article reports that hiring workers from all over creates more diversity and opens new possibilities. Recruiting from a wider region means fewer racial, age, and gender biases; mothers, for example, will have an easier time rejoining the workforce after long stretches at home. Another major advantage for employers is salary. It’s not that remote workers get paid less, but cities like New York, San Francisco, Boston, and Washington, D.C. are expensive places to live. By hiring talent away from their headquarters, companies can find comparable employees in locations where the cost of living is much lower. That gives the employee more flexibility when it comes to salary, and companies more leverage to negotiate.

 

Now is the time for companies to focus on revenue over growth. Remote work facilitates long-term cost savings. The benefits include more leverage to negotiate for talent all over the world.

 

Being able to offer telecommuting options is a genuine company benefit. When available, telecommuting is listed as a benefit on a company’s website and as a perk in career-opportunity ads, often alongside retirement options and vacation policies, and it’s usually touted throughout the hiring process. There’s a reason for that. Telecommuting lures those who are already familiar with working from home. Some professionals have always wanted to work from home but never had the option; those who have worked from home, either partially or full-time, often seek out similar jobs, and companies that embrace this type of culture, in their next role. A strong remote workforce breeds job satisfaction, and that satisfaction yields productivity.

INCREASED JOB SATISFACTION EQUALS INCREASED PRODUCTIVITY

 

The infrastructure for remote working, including laptop computers for every employee expected to work from home, must be in place.

 

Remote workers tend to be more satisfied because of the autonomy remote work brings. At home there are fewer distractions (well, in most cases), and employees have more flexibility in their schedules. Allow employees to be autonomous and they’ll have an increased sense of ownership and freedom. In an office setting, there’s a need to conform to office attire, set hours, and a cubicle or desk. The Owl Labs report shows that 71% of remote workers are happy in their current role, compared with only 55% of non-remote workers. Job satisfaction yields productivity, and job fulfillment, in turn, results in less turnover in the workplace.

Having remote employees also means much less overhead. You don’t need the office space, and the cost savings alone are reason to get behind this movement. Office space in San Francisco runs around $80 per square foot; New York City hovers around $90. The cost of remote working space is, of course, nonexistent. Office furniture is another major factor: a high-end office chair can cost a company $800 to $1,000. Historically, companies have not provided stipends for home-office use and expenses, but as the current situation continues, that may change. A good case can be made for ongoing telecommuting even after the coronavirus crisis ends, and in that scenario some companies will offer reimbursement programs for home offices.
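To put those per-square-foot figures in perspective, here’s a back-of-the-envelope calculation in Python. The $80 and $90 rates come from the article above; the 150 sq ft per employee, the 50-person headcount, and the treatment of the rates as annual (as commercial rents typically are quoted) are all illustrative assumptions.

```python
# Rough annual office-space cost, using the per-square-foot rates cited above.
# Headcount and square footage per employee are illustrative assumptions.
SQFT_PER_EMPLOYEE = 150   # assumed open-plan allocation per person
HEADCOUNT = 50            # assumed team size

def annual_space_cost(rate_per_sqft: float) -> float:
    """Annual rent for the whole team at a given $/sq ft rate."""
    return rate_per_sqft * SQFT_PER_EMPLOYEE * HEADCOUNT

sf_cost = annual_space_cost(80)   # San Francisco: ~$80/sq ft
nyc_cost = annual_space_cost(90)  # New York City: ~$90/sq ft

print(f"San Francisco: ${sf_cost:,.0f}/yr")   # San Francisco: $600,000/yr
print(f"New York City: ${nyc_cost:,.0f}/yr")  # New York City: $675,000/yr
```

Even under these rough assumptions, a mid-sized team’s office footprint costs well over half a million dollars a year before furniture, utilities, or maintenance.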

Some employees have home internet connections that are faster and more reliable than the office network; others do not. Companies embracing work from home find that employees tend to use BYOD (bring your own device).

If an employee is working in their own home, on their own time, why not let them use their own equipment? BYOD adds flexibility. Most people use their personal devices and computer setup as much as possible, especially if they have a more powerful laptop than the one issued by the company. Think of a company’s infrastructure: the telephones, the network, the HVAC. These all become cost savings when large portions of the workforce do their jobs from home.

Old technology prohibited the work-from-home option for many businesses. Today, that’s no longer true. Companies can remove the obstacles that keep employees from working from home.

THERE ARE MANY TOOLS TO HELP WITH THE TRANSITION
Technology can no longer be an excuse not to work from home. There are a number of collaboration and communication tools that can handle any workflow.

Look at the hardware available today. The quality of wireless headsets (from Plantronics and Jabra, for example) has all but eliminated background noise, and conference calls from home are now part of regular business life. There are desks that raise or lower as needed; these sit-stand workstations help sustain energy levels for those who spend many hours at a desk. Other items include multiple monitors for extended viewing, particularly useful for design work, and laptops to fit any task requirements.

Web conferencing software (Zoom, WebEx, or Skype for Business) works anywhere, and attendees have the option of video or audio-only meetings. Collaboration is key: keep employees productive within groups and keep them communicating. Tools such as Slack keep information flowing.

Slack, a simple SaaS solution, incorporates one-on-one and group chats, system notifications, and simple file sharing for your entire organization, with straightforward pricing. Telecommuters needing technical help can use TeamViewer or RemotePC.

Having your data backed up to the cloud is also important, since your computer is no longer on the company network. Syncing your work to the cloud is as simple as using Microsoft OneDrive or Google Drive. Time-tracking tools can report how long various tasks take, and even how long you spend on different web pages.
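Under the hood, such time-tracking tools are simple accumulators. As a minimal sketch (not any particular product’s implementation), a task timer can be built from Python’s standard library alone; the task name below is made up:

```python
import time
from collections import defaultdict
from contextlib import contextmanager

totals = defaultdict(float)  # seconds accumulated per task name

@contextmanager
def track(task: str):
    """Add the wall-clock time spent inside the block to `totals[task]`."""
    start = time.monotonic()
    try:
        yield
    finally:
        totals[task] += time.monotonic() - start

# Example usage with a made-up task name:
with track("quarterly-report"):
    time.sleep(0.05)  # stand-in for actual work

print(f"quarterly-report: {totals['quarterly-report']:.2f}s")
```

Real products layer reporting, idle detection, and per-application or per-URL attribution on top of the same basic idea: timestamp when a task starts, timestamp when it stops, accumulate the difference.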

The coronavirus has provoked an exodus from the corporate office to the home. Whether physical distancing is short-lived or long-term, how business leaders manage their remote workers will determine the level of productivity, and communication from managers will have much to do with job satisfaction.

There are many SaaS-based apps available that keep employees engaged and available, while giving them the flexibility to step away for a break. It’s a win-win for employees and their employers.

Job satisfaction and productivity are up because of remote work. The question is how you will institute a proper policy. The details will differ for each business, but a recent Glassdoor article proposes a basic approach: “adequate technology, disciplinary excellence, and clear communicative instructions.”

Employers now have more options to hire cream-of-the-crop talent, focusing on skill set rather than location. Working from home gives business leaders more time to focus on productivity and bolstering revenue.

 

Why Every AEC Firm Needs to Move to the Cloud Now

Cloud computing is the future of everything digital; modern IT environments now host their infrastructure in some form of the cloud. Moving to the cloud is especially important for architecture, engineering, and construction (AEC) firms. A 2017 Sage survey found that most AEC firms had already moved to the cloud, quite a jump from a 2012 survey in which only 16% of construction contractors had migrated.

It’s no surprise most of them are on the cloud in one form or another. The AEC industry is highly fragmented, data-intensive, and project-based. Designing, building, and repurposing require all the traditional disciplines you’d expect, but also many ancillary areas such as energy, environment, and waste.

The Journal of Cloud Computing: Advances, Systems, and Applications reported that sharing data and coordinating between the people involved is difficult and reliant on third-party tools: “We believe cloud computing provides a more efficient and robust mechanism for individuals within the AEC industry to collaborate and share data. Work is already underway in the AEC sector for developing data and process models to enable greater interoperable working between project participants.”

This research led to the concept of Building Information Modeling (BIM), a design process that looks at a building’s entire life cycle. BIM helps designers and others see how a building will use resources before it’s built. The concept was an evolution of ideas: start with a powerful digital drawing tool, then grow it into a much more sophisticated program that works in partnership with the designer or architect. A set of drawings becomes an interactive database. As the designer draws on screen, the BIM system computes the properties of the building and even suggests improvements for everything from energy efficiency to people flow, while costing out every conceivable option. Every variable is built into the Autodesk software: any design change is immediately reflected in revised cost estimates and projected energy savings. The architect works with both a set of drawings and a data model that understands the whole building as a three-dimensional living system. Keep in mind that BIM includes all the information about a building; it should be a complete 4D virtual repository of the data associated with the structure from the beginning to the end of its life.

 

Being on the cloud facilitates hiring, and retaining, some of the best talent from all over the world.
THE CLOUD ENABLES REMOTE COLLABORATIVE TEAMS to work seamlessly together on complex projects.

Collaborative working environments have long been a key aspect of AEC workflows. Traditionally, those collaborative teams had to commute to one centralized location. Today, offering flexibility in work environment (home office or corporate office) has become somewhat of an expected perk, a trend that began long before the coronavirus reared its ugly head. Now government mandates are pressing the point even harder, and we’re all being forced to work from our homes. Coronavirus aside, future AEC firms don’t want their collaborative teams tied to one physical location. Not anymore.

Jennifer Howe, VP of SMMA (an architectural firm headquartered in Boston) and acting president of the Massachusetts chapter of ACEC, says, “As much as I don’t want to be working from home, there are times when I need to be working from home. Our IT staff had us set up to work remotely, but it wasn’t the same as what we have now with the cloud. I can be on my laptop with IronOrbit and see everything the same way we see it while we’re in the office.”

She recognizes that it’s more of an employee’s market now. “The ability to offer talented candidates the option to work from home is an added incentive to join your team.” That’s especially true when nothing is lost moving from the office workstation to your mobile device of choice at home. But there are other reasons to migrate to the cloud.

An enhanced remote work experience is not the only reason to move to the cloud. The bigger, more critical reason is security. But it can’t be just any cloud solution; the cloud environment needs to be customized to the unique needs of the firm. Jennifer talks about the biggest threat every firm faces: “Ransomware attacks are a tremendous concern. An ACEC Mass member firm had a recent incident where they were hit with a cyber-security breach. That was very concerning to our entire chapter. ACEC actually hosted an informative event where they shared some of the issues that they had. For SMMA, as government contractors, we need to be very protective and careful with the information that we have.”

Just a few short years ago, Google Drive and Dropbox were the popular options among those who wanted to share large files. But those options weren’t great at protecting intellectual property, and concerns over security justifiably kept many AEC firms from utilizing them.
In addition to state-of-the-art firewalls, antivirus protocols, malware filters, and encryption, a truly holistic approach to security includes 24/7 monitoring.
INDUSTRY-WIDE CONCERN FOR SECURITY IS AT AN ALL-TIME HIGH

Carlos Charry is the Director of Technology for SMMA. He says security has been a top concern for everybody. “One of our competitors got hit with ransomware a few years back. It made me look at our own situation and ask, ‘Are we prepared for this?’ I knew we weren’t ready.”

The level of security provided by IronOrbit, the firm’s cloud solution provider, is far beyond anything they could have accomplished on their own. The entire IT infrastructure is protected by state-of-the-art firewalls, antivirus protocols, malware filters, and encryption. And the security doesn’t stop there: an entire team of engineers, rotating around the clock, monitors the data centers for any type of potential security threat.

But Carlos adds, “The question of security aside, you still have to keep up with technology. That means having your IT infrastructure on the cloud. The cloud provides faster updates, and just keeping all your applications up to date saves you a lot of trouble. Most of my time before the cloud was spent handling IT issues, things like the network not being responsive or our server going down. I spent time on things like that and couldn’t devote myself to what I truly love to do, which is improving our business processes. I want to make them better so the company can become ever more efficient.”

Carlos continues, “The cloud has enabled us to hire anyone anywhere in the world. The employee just needs a PC and an Internet connection of some kind and they can utilize our tools. We currently have people working for us from Maine and New York. Since we’ve moved to the cloud, my headaches have been reduced. Once an employee is connected to the cloud, I don’t have to worry about it. I know the data is automatically being backed up. My worries are basically gone.”

FINDING THE RIGHT WAY TO COLLABORATE IS CRITICAL TO RUNNING AN EFFECTIVE BUSINESS

Jennifer says, “Working with Carlos, our IT director, we’re always looking for better ways to do our work. SMMA is a full-service design firm, and collaboration is the key to our success. Finding the right way to collaborate internally and with our clients is a critical part of running an effective business.”

MOVING TO THE CLOUD: WHAT IS IT LIKE?

People were hesitant at first; a cloud environment is different from having your server on the premises. “As we were going up to the cloud and trying to figure out how to use it, people weren’t sure at first what to expect. Is it going to make my life better or worse? Finally, through effective collaboration and communication, we found it to be an invaluable tool. I find that I can access whatever I need wherever I am. One of the things that surprised me was being at a client meeting, just on wi-fi, and acting as if I were in the office. I’m able to pull up any document I need at any time. For example, I do a lot of government work, and when I’m doing a client visit I often don’t have wi-fi available. No worries: I just turn on the hotspot on my phone and can still open up a CAD drawing. You’d think that would be impossible, right? But it really works quite well.”

 

Being able to be remote and share a CAD drawing on your laptop using the hotspot on a smartphone is amazing. “You think it’d be impossible, but it actually works very well.”

 

Hector Inirio is SMMA’s Design Technologist. He says, “The most attractive aspect of moving to the cloud was a blend of things. There are many aspects of advanced IT that are beyond our expertise, such as high-end security threats; ransomware is a good example. I really liked the fact that cloud technology democratized our computer systems. We’re not transferring any data from our local workstations; the workstations themselves really become more like dumb terminals. So no matter what kind of computer was at a particular desk, they all now respond like high-end machines. Previously, due to cost, we’d only have some users on higher-end machines; the ones who didn’t need the computing power worked on equipment with less of it. Now, all of them respond with higher specs.”

“I really liked that cloud technology democratized our computer systems. It made all of them perform like higher spec machines” – Hector Inirio

The computer terminals become virtual desktops hosted on the external cloud servers. Any slowness or frustration you’ve experienced with your current Internet connection goes away: once users log in to the hosted desktop, they’re using bandwidth from the cloud, which has dedicated gigabit connections to the Internet. Your bandwidth becomes virtually unlimited.

The technology needed to support the construction industry’s complex workflows only became available in the past few years. There are now plenty of SaaS solutions that make full use of what cloud technology offers. Most contractors are implementing cloud solutions; the few who are not risk losing any competitive edge they had, and are in danger of becoming irrelevant as technology advances at exponential rates. They simply won’t be able to keep up. Remaining current with the speed of technology means being able to focus on human capital: talent, skills, know-how, empathy, and creativity. These are undervalued human assets waiting to be unlocked, and you can’t leverage them if you’re stuck in the mud because your technology isn’t current.

MAKE FULL USE OF THE BENEFITS

Construction companies already on the cloud should evaluate whether they’re making full use of it. Another benefit of cloud computing: construction companies can store tremendous amounts of big-data files on more powerful machines and do more with fewer resources. Anywhere there’s an Internet connection, you’re good to go. Being on the cloud removes hardware limitations, prevents loss of data, dramatically improves security (if designed correctly), and improves accessibility.

One of the key issues within the industry is the storage of building data throughout the whole life of the building. Data processing is another important concern: during construction, a large part of the work takes place on-site, where computing resources have, until now, been non-existent.

The cloud offers data-processing power. Drones hover over construction sites, taking pictures with detailed GPS coordinates and metadata; stitching these images into an orthomosaic requires more processing power than typical computers can muster. Visiting job sites used to take hours, but now construction sites can be viewed via a SaaS platform: a design captain or engineer can get a real-time view of the location from anywhere in the world, on any device. This technology also makes sharing data much easier. There’s a misconception that data becomes less secure on the cloud; the opposite is true, provided the new cloud environment has been designed with tight security in mind. If the data is kept at a Tier 3 data center with round-the-clock monitoring, cybersecurity is on an entirely different level, one that isn’t possible for on-prem servers or public clouds.

The Remote Work Survival Kit Under the Threat of the Coronavirus

There is no denying the impact COVID-19 has had on us over the past couple of months. The coronavirus has managed to work its way into every conversation, news headline, and social media post.

The coronavirus is a pandemic, according to the World Health Organization, and the threat of its spread has changed the way we live. We have to prepare ourselves for the coming months. Canceling large events and gatherings is one way to mitigate the spread of the virus; sports, schools, churches, and many businesses have closed or are avoiding interaction with the public. Social distancing is the new mandate. Government officials have urged us not to congregate in large crowds and to stay at home if possible. Many companies are emailing employees, asking them to work from home if they can. Companies that aren’t set up to work remotely are scrambling to make it happen. What was once an option has become a necessity.

This article provides some options for delivering a great work-from-home experience. None of these technologies are new, but used in combination they will ensure a better work-from-home experience.

Let’s start with the one that can take many forms and methodologies: BYOD, or Bring Your Own Device. Gartner defines BYOD as allowing someone to use a personally owned device to access a company’s resources. That could mean the company’s email, or it could mean installing a VPN client on a home computer. Each company has a different take on the level of access granted to non-company assets.

 

The “Bring Your Own Device” concept has been around since 2004; it is not a new trend. What is new is the popularity of using personal mobile devices on the job. The security risks of allowing access to corporate resources have discouraged some companies from adopting a BYOD policy.
Bring Your Own Device

In this post by Remote.co you can get a sense of the varying role BYOD plays at different organizations. BYOD had its start in the mobile device world. Companies were tired of purchasing cell phones for employees, and employees were tired of carrying around two phones: their personal phone and the locked-down, outdated one provided by the company. Since then, companies have found other ways of securing business data on personal devices.

Mobile Device Managers

Microsoft Intune and VMware AirWatch are MDM programs that help protect corporate data on personal devices. Employees get access to an enterprise app store where they can consume internal data on their device of choice. The employee first opts in by installing the MDM agent on their device; the list of supported devices with modern operating systems is no longer limited to smartphones. Once the agent is installed, the company can push down a profile that allows the device to be managed. Both Intune and AirWatch have a robust set of policies for Windows, macOS, iOS, and Android. The degree of enforcement varies by company and device type. Once the agent is deployed and the security baseline is configured, the device can be actively monitored and secured. That could mean enforcing BitLocker encryption on Windows 10 devices or managing FileVault on macOS with Intune.

Virtual Desktop Infrastructure

VDI technology has taken many forms over the years. In its purest form, VDI is accessing a virtual machine over the network from a client or web browser. This lets companies keep virtual machines always available on the internal network, controlled by existing management systems, with security tools protecting company-provided applications and data. A proper VDI solution can be a major advantage for employees, especially those who need to travel or work from various locations and devices. If a company already has VDI in place, deploying new virtual desktops is easy; accommodating new users takes only seconds.

VDI began as a technology installed on-premises or in a company’s private data centers; later it transitioned to the cloud. The major VDI players, Citrix, VMware, and Microsoft, all have major cloud offerings, called DaaS, or Desktop as a Service. Citrix and Microsoft host their DaaS offerings within Azure; VMware can host desktops in AWS, Azure, and the IBM Cloud, with Google Cloud coming soon.

Cloud-based virtual desktops have great advantages, especially in situations like disaster recovery, where traditional VDI takes longer because new hardware must be procured and deployed. DaaS also brings extra benefits like less IT overhead, since the cloud provider manages more of the components.

 

Multi-factor authentication (MFA) is a means by which a computer user is granted access only after successfully presenting two or more pieces of evidence (factors) to an authentication mechanism. The factors usually involve knowledge (something only the user knows), possession (something only the user has), and inherence (something the user is, like a fingerprint, voice scan, or retina scan).

Let’s discuss the use of a multi-factor authentication solution. Two-factor authentication (2FA) is a subset of MFA. It ensures you can pass multiple criteria for identity: something you know (a password or security PIN), something you have (a security token or fob), and something you are (a fingerprint, retina scan, or facial recognition). A 2FA solution requires exactly two of these mechanisms to prove your identity.

We’ve all had to input our email or phone number when signing up for an account online. A mobile banking app is a good example: the authentication mechanism can be a one-time password sent to you via text message, or your phone’s built-in face or fingerprint reader. These are ways to prove your identity.
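The rotating six-digit codes produced by authenticator apps typically follow the time-based one-time password (TOTP) algorithm defined in RFC 6238. Here is a minimal sketch in Python using only the standard library; the secret shown is the RFC’s published test key, not a real credential:

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, t=None, digits=6, step=30):
    """Time-based one-time password per RFC 6238 (HMAC-SHA1 variant)."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int((time.time() if t is None else t) // step)  # 30-second window
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                                # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238's published test key ("12345678901234567890" in base32).
RFC_TEST_KEY = "GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ"
print(totp(RFC_TEST_KEY, t=59, digits=8))  # 94287082 (the RFC's expected value)
```

Because both sides derive the code from a shared secret and the current time, nothing needs to be transmitted to the phone at login, which is why these codes work even when the device is offline.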

The FBI warns that MFA solutions are not completely foolproof, but they remain the best way to thwart cyber-thieves from stealing your data; a second form of authentication is safer than a long password alone. Most modern smartphones and laptops have a built-in fingerprint or smart card reader. There are several key players in the MFA space, with the top leaders including Okta, Microsoft Azure MFA, and Duo (recently acquired by Cisco). Duo uses a simple cloud-based 2FA approach that integrates with various types of applications. When a user attempts to gain access, the VDI or VPN provider sends a push notification to the user’s smartphone; the user acknowledges it, with no need to enter a second password or copy a 16-digit PIN for verification.

The order from management is to stay at home. Do not come to the office for the next 2 weeks. Work remotely until government and health organizations deem the coronavirus has been contained. Don’t worry about a report or project plan saved on your office desktop. Embrace VDI technology.

Do Your Work, Anywhere, and on Any Device

 

If you’re new to working from home, make sure your technology is in order. One important aspect of working remotely is communication. Make sure you have the bandwidth needed to support your tasks throughout the day.


VDI means working from a virtual desktop every day. Your data is always available, accessible from wherever you are and protected. Your data is more secure now than it ever was when kept on-premises. The data is backed up across different geographic regions within the cloud. There is no need to worry about catastrophic power or network outage at your local data center. It’s also always on and provides a consistent experience whenever you need to access it.

Maybe you don’t need a full Windows Virtual Desktop to get your work done. You just need access to a handful of SaaS apps like Salesforce.com. An Okta or other MFA solution can help authenticate you from an outside connection. This allows you to gain entry to those specific internal resources without the need to install a VPN client.

Or, what if all you really need is to access your corporate email and files on your phone while safe at your home? Having your smart device enrolled in your company’s Mobile Device Management solution can provide the access you need while keeping the business data secured.

Deciding how to start a remote work enablement plan for your team can seem overwhelming. Like other challenges, it is not so daunting when done in small steps. Better yet, bring in experts who can design a solution that works best for your business.

There is no one-size-fits-all approach. There are many ways to enable employees to work from home; the right one depends on your needs.

Many adversities are beyond our control. It is helpful to focus on those things we can control. We can take steps to prepare for the uncertainties ahead. We can do what is best for our employees and our loved ones.

Using the cloud to work remotely is less about “social distancing” and more about benefiting your company. Being on the cloud will democratize opportunities for you across the board. You’ll see that remote work is not so much a challenge to overcome as a business advantage to achieve.

 

Check out IronOrbit INFINITY Workspaces! The Ultimate Remote Work Tool!

 

The Coronavirus Tests Global Readiness for Remote Work
As the threat of a coronavirus pandemic wipes away trillions of market value dollars, the largest mass exodus from the traditional office is underway.
The coronavirus threat pushes the question, “Are we ready to have our employees work from home?” Organizations want to do whatever they can to help contain the spread of the virus.

One of the top healthcare conferences of the year, HIMSS, was canceled at the last minute. Everyone knows why. The canceled HIMSS conference was only the first in a series of conference cancellations this month. How many more conferences will be canceled? Only time will tell. A quick survey online shows that Google, Intel, Facebook, and Twitter have canceled many of their conference plans. The South by Southwest (SXSW) conference has not yet buckled under pressure to cancel.

Andrew Keshner reports in a MarketWatch article that, “As the Coronavirus spreads, companies are increasingly weighing if they should, or even can, have workers do their jobs from home.” The article goes on to announce that Twitter told its 5,000 employees around the world to work from home. The BBC News reports Twitter’s head of human resources Jennifer Christie said, “Our goal is to lower the probability of the spread of the Covid-19 coronavirus for us – and the world around us.” Twitter has been developing ways for employees to work from home, and its mandate moving forward is to enable anyone, anywhere to work at Twitter. Twitter began moving to a more mobile workforce before the coronavirus. Now, many companies are taking steps to enable employees to work from home. Asian-based organizations, the ones that could, have already implemented work-from-home options. Several giant multinational companies such as Citigroup have restricted travel to Asia.

The Best Advice: Plan and Prepare

The media seems to suggest there are only two states you can exist in: ignorant bliss or panic. There’s a wide territory between those two extremes. People should not panic. They should be aware of what’s going on, maintain an appropriate level of concern, and respond. Managing risk is an important part of life, and it’s an important part of leading a business. Understand the risk, understand what might happen, and make decisions to keep business moving.

The Centers for Disease Control and Prevention (CDC) has announced it cannot contain the coronavirus, which means we’re down to mitigation strategies. The CDC is going for non-pharmaceutical interventions (NPIs). This translates to things like closing schools and preventing people from attending large gatherings. If necessary, officials will issue self-imposed quarantine orders. If self-imposed quarantines don’t work, the CDC will issue a contained quarantine order, which means there’s no choice in the matter.

The CDC recommends that companies encourage telework. “For employees who are able to telework, the supervisor should encourage employees to telework instead of coming into the workplace until symptoms are completely resolved. Ensure that you have the information technology and infrastructure needed to support multiple employees who may be able to work from home.” Technologies enabling employees to work remotely have been around for some time, and interest has grown over the years. It has been a matter of simply deciding to offer that flexibility to your employees. Managers have to determine the right ratio of office work to home work. It’s more a leadership decision than any limitation of the technology. But the coronavirus threat will certainly act as a catalyst, accelerating the adoption of remote collaboration tools. Most companies will be forced to have their employees stay home. Microsoft has announced free upgrades: Office 365 users can now make full use of the video conferencing and recording features of Microsoft Teams.

 

Businesses can replace in-person meetings with video and increase networking options. Now is a good time for businesses of all kinds to start preparing. If you don’t have the infrastructure already in place, start planning it. Most organizations are not prepared for widespread enablement of remote departments; many are still evaluating requirements and solutions. Workers can work as effectively at home as in the office. Research indicates employees are even more productive working from their home offices.

Moving to the Cloud Has Never Made More Sense Than Now

Cloud technology and remote workspaces enable organizations to be flexible with their staff. It’s also an attractive incentive when recruiting talented employees. Astute business leaders want to be in a position to offer remote collaboration tools to their employees and to establish parameters in which a work-from-home culture thrives. Jennifer Howe, VP of SMMA, an architectural firm in Boston, and acting president of the ACEC Massachusetts, said, “Remote workspaces are invaluable these days. You can’t recruit and retain talent without that kind of flexibility.”

A recent article on the Fortune website calls it the “world’s largest work-from-home experiment.” Millions of businesses all over the world are trying to stay productive amidst this growing crisis. The article goes into detail on the level of upheaval for companies, particularly in Southeast Asian countries: “One of the most unsettling factors for employees is the rapidly-changing impact of the virus. It is prompting daily changes in corporate directives.” We’re seeing that kind of impact in the States as more and more cities declare a state of emergency.

 

A giant experiment is underway to see how well new technologies can enable successful mass remote working for employees.

 

Managers worry the exodus from the office will lower productivity. Many studies support the exact opposite: productivity doesn’t go down, it goes up. A 2017 Stanford University study, often quoted, found a 13% increase in productivity. A study conducted at the U.S. Patent and Trademark Office showed remote workers had a 4.4% increase in output. A recent survey by the consulting firm Deloitte found 82% of white-collar workers using flexible work options.

 

What Does Remote Work Look Like?

Unlike companies that are designed from the start to hire work-from-anywhere employees, traditional in-office companies have to decide how this will work. Management has to set parameters on how remote work happens and communicate expectations to employees. How will the team stay in contact with each other throughout the day? What level of responsiveness is needed? Does your staff need access to robust programs like AutoCAD, Maya 3D, or Adobe After Effects? If so, how, on a technical level, is that going to happen? For example, GPU-hungry programs will need to be hosted on a virtual server, and the work-in-progress files will have to be stored in a central location. This isn’t accomplished overnight. Now is a good time to start having those discussions.

The worst thing you could do is nothing. Business leaders shouldn’t ignore the situation as it continues to escalate. Ask yourself: if this continues, would your company be able to operate productively? To what extent would your company be forced to stop its activity altogether?

At some point, we are all going to enter the coronavirus tunnel and make it through to the other side. The collective experience will force us to redefine the way we work. We will consider how we interact with each other. Who operates as a self-starter? Who needs closer supervision?

Alvin Toffler was a writer, businessman, and futurist. He envisioned the digital revolution long before it happened and foresaw the remote workforce as an inevitable 21st-century trend.

The idea of remote work is not new; it goes back 50 years. Futurist writer Alvin Toffler wrote about remote work in his 1980 book The Third Wave: “When we suddenly make available technologies that can place a low-cost “work station” in any home, providing it with a “smart” typewriter, perhaps, along with a facsimile machine or computer console and teleconferencing equipment, the possibilities for home work are radically extended.”

Cloud technology enables a home computer (a “low-cost workstation,” as Toffler calls it) or any mobile device, for that matter, to do the job. The home computer, smartphone, or tablet essentially serves as a dumb terminal; the processing power comes from a virtual desktop. For all practical purposes, it’s just like working from your office. You have access to the same emails, the same software applications, and the exact same files.

Right now, the coronavirus is forcing us to reconsider work-from-home scenarios. Moving personnel to a more comfortable and safer work-from-home environment has its benefits. For some businesses, this means building some kind of infrastructure.

I’d like to close with a question posed near the end of the Forbes article: “If you are an employer and you have the power to offer greater freedom to your workers, should you not be thinking about how to do so?”

 

 

 

Technology’s Impact on Healthcare

Technology is transforming the way healthcare operates. The impact is not on one level but on many.  It is certainly a game-changer for the way communication happens and the way data is stored. Most importantly, it is truly enhancing the patient experience. Technology transforms the way patients are diagnosed and treated. It’s also transforming the way the business side is handled.

The true dynamo behind the great healthcare overhaul is mobile technology: the smartphones and tablets carried by doctors and nurses as they move between one location and another. Cloud technology provides on-demand access to any IT resource you can imagine, and it delivers resources previously unavailable. This blog will introduce some of these new resources. Because these resources make use of cloud computing, they can be accessed from any device, anywhere on the planet with an Internet connection. The added benefit, again because it is on the cloud, is the flexibility of being able to scale capacity up or back as needed. Bandwidth is unlimited. Store as much as you want. Gone are the days of being frustrated with a slow workstation.

There are two drivers behind this technology: reducing costs and improving the quality of patient care.

There are more mobile devices than there are people on Earth. Clinicians are connected as never before. This means that medical professionals can immediately tap into, contribute to, and benefit from, a growing pool of global medical knowledge. At the swipe of a finger, a doctor can access the latest research on a given disease, learn about the latest drug, or clinical trial outcomes. They can benefit from the collective experience of colleagues worldwide.

Things are changing from the patient side as well. Patients are becoming increasingly accountable for their own health and well-being. They’re doing their homework on diseases and illnesses. They want access to their own data. In the June 13, 2017, Forbes magazine article How The Cloud is Transforming Healthcare, Khalid Raza writes, “providers must satisfy the demand for instant, top-quality access to healthcare services. Patients – who are accustomed to the 24/7 availability and service from online retailers and financial institutions – expect and even demand such access and services from their healthcare providers. People have become more involved in managing their own healthcare needs, which only complicates matters, and gravitate to the web for diagnosis, information, and treatments.”

Software companies have had the pulse on these industry-wide healthcare trends. These companies have responded with new technologies designed to significantly contribute to the flow of knowledge and the efficiency of future healthcare.  There are now multiple secure messaging technologies available to doctors who want to have a quick informal consultation with a colleague. These tools have many of the same features. For example, all communication is tracked and logged automatically.

Here are a few of the new technologies that are changing the face of medicine. And they’re all being facilitated by cloud computing in one way or another.

 

DIGITAL FLOWS
SPEED UP
DIAGNOSIS, PROGNOSIS & TREATMENTS

Thick, heavy reference books are still collected in doctors’ offices and nursing stations. These mammoth books are gathering dust now, probably forgotten or left in place for reasons of interior design. Now, if a nurse or doctor needs a quick reference, they pull out their smartphone. Mobile apps enable clinicians to quickly dial into any information needed about drug interactions or complications associated with a particular condition.

 

The Med360 Mobile App

Med360 is a program that automatically collects every new publication matching your interests. It gathers data from thousands of traditional open-access journals and funnels it into your personal stream. A doctor has only to call up the app on his or her smartphone and do a quick scan of the screen to know exactly what’s going on with a patient’s medication history and reconciliation. Pharmacy pickups, dosage changes, and refills are presented in a clear interface on the clinician’s mobile device.

 

 

 

 

 

VAST AMOUNTS OF DATA

The February 2019 article in Nature Medicine reported on a program that used patient information such as symptoms, history, and lab results to diagnose common childhood diseases. According to the article, the system was given data on nearly 600,000 patients at a pediatric hospital in China. The results produced by the system were highly accurate.

In another February 2019 article, Cade Metz reported that Google is developing and testing systems that analyze electronic health records in an effort to flag medical conditions such as osteoporosis or diabetes. Similar technologies are being developed to detect signs of illness and disease just based on X-rays, M.R.I.s and retina scans. The main thing these innovations have in common is their reliance on neural networks. This is a breed of artificial intelligence that learns tasks largely on its own by analyzing vast amounts of data.

Computers can be programmed to recognize patterns amongst vast amounts of data. These patterns can be linked to specific conditions. These are patterns that would be difficult, if not impossible, for a person to notice. Huge amounts of data from medical imaging are fed into artificial neural networks. The program follows an algorithm. The computer then proceeds to learn on the job so to speak. The more data it receives, the better it becomes at interpreting the data.
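The learning loop described above (predict, compare to the label, adjust) can be sketched with a toy classifier. This is a deliberately simplified illustration with made-up “feature” data, not a medical model; a real system would be a deep neural network trained on medical images:

```python
import random

def train_perceptron(samples, epochs=20, lr=0.1):
    """Learn weights from labeled examples: predict 1 when w·x + b > 0."""
    w, b = [0.0] * len(samples[0][0]), 0.0
    for _ in range(epochs):
        for x, label in samples:
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            err = label - pred                       # 0 when already correct
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

def accuracy(w, b, samples):
    hits = sum((1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0) == y
               for x, y in samples)
    return hits / len(samples)

def make_samples(n):
    """Synthetic data following a hidden rule: label 1 when x0 + x1 > 1."""
    out = []
    for _ in range(n):
        x = [random.random(), random.random()]
        out.append((x, 1 if x[0] + x[1] > 1 else 0))
    return out

random.seed(0)
test_set = make_samples(200)
small = train_perceptron(make_samples(10))      # trained on little data
large = train_perceptron(make_samples(1000))    # trained on much more data
print(accuracy(*small, test_set), accuracy(*large, test_set))
```

With the larger training set, the learned boundary tends to track the hidden rule more closely, which is the “more data it receives, the better it becomes” effect described above.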

This learning process is already being used in many applications. Computers learn to understand speech and identify objects this way. Self-driving cars can recognize stop signs and tell the difference between a pedestrian and a telephone pole. Google has created a program to help pathologists read microscope slides to diagnose diseases like cancer.

 

Mobile devices are the key to tapping into knowledge flow streams.

KNOWLEDGE ACCESS

ON

ANY DEVICE ANYWHERE

The fact that everything is accessible on any device anywhere means patients can get medical help at the hospital, at the ambulatory center, and in the comfort of their own home. In the past, if you wanted to see the doctor, you’d physically have to travel to where the doctor practiced medicine and visit the doctor’s office or go to the emergency room.

Now, much of that care can appropriately be pushed into the patient’s home.

 

Telehealth is the distribution of health-related services and information via electronic information and telecommunication technologies. It allows long-distance patient and clinician contact, care, advice, reminders, education, intervention, monitoring, and remote admissions.

Hospital at Home, a program at Mount Sinai, enables video visits. You can check in, access monitoring tools, and input your vital statistics. Patients can check their pulse, blood pressure, or weight, and the information can then be sent to the patient’s care team for review and response.

In a May 10, 2019, Harvard Business Review article, Albert Siu and Linda V. DeCherrie report that “research has shown varying but clearly positive impacts on mortality, clinical outcomes, readmission rates, and cost. A 2012 meta-analysis of 61 randomized, controlled trials, for instance, found that the hospital-at-home patients had a 19% lower six-month mortality rate compared to hospitalized patients. Our research finds that patients who receive hospital-at-home care have fewer complications and readmissions; they also rate their health care experience more highly.”

Bruce Darrow, MD, Ph.D. and Chief Medical Information Officer at Mount Sinai in New York.

Bruce Darrow, M.D., Ph.D., cardiologist and Chief Medical Information Officer for Mount Sinai Health Systems says, “It’s empowering for the patient and it’s good for the clinicians too. The technology allows doctors to let the patients do the jobs they would want to do themselves.  Artificial Intelligence is going to be essential to healthcare. When we think about doing the work with patients at growing population levels effectively, A.I. technology is going to play an important role. If I’m a primary care doctor who is taking care of 2,500 patients, only 20 or 30 of those patients will come into my office on any given day. At the same time, there may be several at home who are at risk. Rather than combing through the entire list of 2,500 patients, if I have tools to look at the prior history of the patient along with their current vital signs, I can determine who I need to see first.”

 

Medical record systems are notorious for not communicating with one another.

Darrow goes on to say, “Electronic medical records have been challenging to connect to one another because of the way they were born. The original idea was not to generate a national patient identity that would allow the same patient to be identified as such from one system to another. There was no original standard for what the medical records would do and how they would interoperate with each other.

The government and the healthcare industry have recognized the problem. That’s where the work of the next few years will be. We’re making progress. At this point, I have patients who come to see me in the office. I can pull their information from a number of systems throughout  the New York area as well as nationwide.”

Telehealth

Telemedicine is the practice of caring for patients remotely when the provider and patient are not physically present with each other. HIPAA-compliant video technology enables clinicians to consult with their patients effectively. Patients can follow up with their doctor through a video visit instead of making the trip to the hospital or clinician’s office, and they can get an on-demand video visit with emergency-trained doctors. A doctor can communicate virtually with a specialist, or a stroke specialist can be brought in remotely to participate in the care of an emergency room patient. All of these things are possible today.

 

The Main Benefit of VDI
VDI Planning: 4 Key Pitfalls to Avoid
What is VDI?

Virtual Desktop Infrastructure (VDI) delivers virtualized desktops hosted on remote servers over the Internet. Reducing the need for hardware while improving flexibility, VDI offers practical benefits as well as a hefty return on investment. There is a strong business case to be made. According to IDC’s January 2016 report “The Business Value of VMware Horizon,” there is a five-year return on investment of 413 percent, and on average the virtualized desktop costs 71 percent less to buy, deploy, support, maintain, and use over a five-year period, on a per-device basis. Users spend 76 percent less time on device and application log-ins. VDI enables companies to make full use of human capital while preventing many IT-related issues. We need all the help we can get to unlock massive human assets such as talent, empathy, and creativity: the things computers aren’t that good at. There are indeed great advantages to moving to a DaaS environment, but there are also many opportunities for mistakes along the way. Let’s take a look at the four most common pitfalls associated with VDI migration.
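The cited figures are easy to sanity-check with back-of-the-envelope arithmetic. In this sketch the dollar amounts are invented placeholders chosen to match the percentages; only the 413% and 71% numbers come from the report:

```python
def five_year_roi(benefits: float, costs: float) -> float:
    """ROI = (total benefits - total costs) / total costs, as a percentage."""
    return (benefits - costs) / costs * 100

# Hypothetical per-device numbers that would produce the cited 413% ROI:
costs = 1_000.0     # 5-year cost to buy, deploy, support, and maintain
benefits = 5_130.0  # 5-year value delivered
print(five_year_roi(benefits, costs))   # -> 413.0

# The report's 71% per-device cost reduction, applied to a hypothetical
# $2,000 traditional desktop:
traditional = 2_000.0
virtual = traditional * (1 - 0.71)
print(virtual)                          # -> 580.0
```

The point of the exercise is simply that an ROI above 400% means every dollar spent returns more than five dollars of value over the period.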

A TechRepublic article cites a lack of planning as a major pitfall of VDI integration, reporting that companies failed to plan for enough resources. Don’t provision only for today or tomorrow; design an infrastructure that will serve your needs next year and for the years ahead. That article was from 2013, and it is just as relevant today.

Decide what the priorities are in your VDI environment.

The problem with most VDI implementations is a lack of planning. Internal stakeholders should begin with a comprehensive assessment of the IT environment, including the individual desktop environment. The VDI landscape has changed over the years. Planning and project management are the keys to a successful VDI adoption. The initial steps start with an internal dialogue, and it’s a good idea to bring in outside expert advice early in the process. Each company is unique, with different demands and different expectations. The time and effort put into VDI planning will pay incredible dividends for years.

Here are a few of the most common hurdles. They can be overcome when identified early.

VDI Planning
A common problem with VDI planning is wanting to include everything.
Don’t Try to Do Everything at Once

The first common issue in rolling out a VDI initiative is trying to do too much at once. This applies to both large and small environments alike. VDI does not look the same at any two companies.

Don’t try to include every attractive feature in your initial implementation. Focus on meeting key objectives, and be selective. Understand the major features and benefits of VDI, but don’t try to include everything at the beginning; this will only slow down the process and distract you from your key objectives. A white paper by VMware recommends taking a step back and considering what you’re trying to do before you even think about IT requirements. Instead of diving straight into technical requirements, such as numbers of servers and sizing of WAN links, begin by exploring user needs, business drivers, and special requirements. These special requirements might include compliance issues, high availability, disaster recovery plans, or even the need for the business to rapidly onboard large numbers of new users due to mergers or acquisitions.

Don’t get stuck on the age-old VDI question of non-persistent versus persistent desktops in the initial deployment.

A company may never deliver a usable VDI solution if it allows itself to get stuck on an idea. Say you determine that 99% of your VDI desktops will be non-persistent; know that getting there will consume considerable OpEx and CapEx funds.

Stay Focused on Key Points
Zero in on what’s most important to you in a VDI environment.

Narrow down what you need in the planning stage to get VDI into a solid, usable state. Set up your VDI on a lean set of criteria. You can make additions as you go.

Do an Effective Initial Assessment

The next hurdle is company-specific, and it is often overlooked due to the upfront cost and time: the VDI assessment that should be part of the planning. The VDI assessment is the discovery phase of the project. It will help you isolate and focus on what is most important for your business.

Identify who will be using the VDI solution. The assessment has two parts: discussion and analysis. Be sure the process includes all the stakeholders, including those who will be using the virtual desktops. Getting them involved early in the design process will help manage expectations and will go a long way in nurturing acceptance of the resulting VDI environment.

Bring All the Brains to the Table
Bringing all the brains to the table will ensure the existing infrastructure is understood and all solution options are on the table.

Let’s use the example of an HR group that will be using VDI during the initial deployment. There is an initial interview, and the agenda includes setting expectations for VDI. Begin by looking at how the company currently uses its computer environment.

Discussions along these lines will establish some parameters.
Do they generally only use a combined set of 4 applications? Do they work at varied times throughout the day? Do they only need a web browser and the ability to email clients on the company network?

You also need to gather data on what traditional desktops are doing during the day. What applications are used? What is needed for the machines to operate?

Most PCs are oversized, with wasted resources. VDI is all about compute and storage density, and determining accurate sizing needs translates into more cost savings. Several tools can handle the second part of this equation, but don’t overlook the first.
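That data gathering feeds directly into sizing. Below is a minimal sketch of rolling per-user peaks up into capacity targets; the usage numbers, field names, and the 20% headroom factor are all illustrative assumptions, not measurements:

```python
# Per-user peak usage collected during the assessment (illustrative data):
usage = [
    {"user": "hr-01", "cpu_ghz": 1.2, "ram_gb": 4.0},
    {"user": "hr-02", "cpu_ghz": 0.8, "ram_gb": 3.0},
    {"user": "hr-03", "cpu_ghz": 2.0, "ram_gb": 6.0},
]

HEADROOM = 1.20  # 20% buffer for spikes, patching, growth (assumption)

def size_host(records, headroom=HEADROOM):
    """Sum observed peaks and add headroom to get per-host capacity targets."""
    cpu = sum(r["cpu_ghz"] for r in records) * headroom
    ram = sum(r["ram_gb"] for r in records) * headroom
    return {"cpu_ghz": round(cpu, 1), "ram_gb": round(ram, 1)}

print(size_host(usage))   # -> {'cpu_ghz': 4.8, 'ram_gb': 15.6}
```

A real assessment would sample over weeks and account for concurrency, but the principle is the same: size from observed peaks, not from the specs of the oversized PCs being replaced.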

Don’t Overlook Management and Support Responsibilities
This third point concerns the IT staff.

Who will be managing the new environment once the consultants have departed? Will you share this duty between existing desktop/infrastructure teams? Or will a new team arise to manage the entire solution? Decide this early on.

Managing a VDI environment requires an engineer who understands several key technologies and knows how they affect the virtual desktop. These technologies include, but are not limited to:

Networking
Know how users connect to the virtual desktop, and know where to troubleshoot problems like lost connections or poor performance.

Compute/Infrastructure
Deep understanding of hypervisors and server infrastructure, depending on the vendor of choice

Security
Knowledge of the security products that will run inside the virtual desktops and sit in the network path of the virtual desktops, for troubleshooting purposes.

Desktop Engineering
Basic knowledge for customizing Windows installations and troubleshooting.

Additionally, there are several other ancillary technologies that come in handy. These technologies include DNS, Active Directory, Application Packaging/Delivery, Load Balancing, and Storage.

Some of these skills can come from classroom training offerings, but many should come from experience. Knowing how all these different technologies work together in your environment is critical.

Larger companies own many of these technologies, and separate teams manage them. It is crucial that all the stakeholders be aware of the impact of VDI.

Know who has ownership of the new VDI systems, and make sure there is buy-in from across your IT organization. This is important to establish at the beginning; everyone needs to be on the same page. This will make it easier for training to occur for those needing to ramp up.

This ownership and buy-in include first-line defenders like your typical service desk team. Let them know they’re responsible for fielding certain common VDI-related issues as they come in, and provide education and resources to support them. Service and support are a key benefit of partnering with seasoned VDI consultants.

Don’t Forget the User Experience

As VDI deployment comes together, don’t forget about the user experience.

The User Experience Is Important
User experience is the final litmus test. How the user feels about the experience determines the success or failure of VDI or DaaS.

Consider how things were before VDI. Chances are, your employees have been using similar pieces of hardware. They know how their workstation machines perform every day (good or bad). They’ll compare the new VDI environment to what they had before.

This goes back to the assessment stage. Understanding the proper sizing and performance of each machine is important; it can mean the difference between a successful adoption and one that isn’t. But it’s also more than that.

If users now have to log in twice to access their virtual desktops, they will complain. If the machine hangs when opening a video conference, they will complain. If patches cause reboots on different days, they will complain. You want to make the changeover to VDI as seamless as possible.

The experience should be better than a traditional desktop, not equal or worse. Make sure you plan to provide the expected performance of each workstation, and allow for a tailored storage solution that is intelligent and optimized for VDI. Consider network outages: if, for whatever reason, users can’t access their virtual desktops, that is a problem too. Here’s the point: outside factors contribute to the total experience on a virtual desktop, and many of these factors will be beyond your control.

The successful adoption of VDI means user acceptance. Deliver a desktop-like experience, and provide the training and support necessary. Company-wide buy-in is key to the success of the whole program. It all begins with planning and making sure you have every brain at the table when that happens.

Ransomware Targets Healthcare
The Healthcare Ransomware Epidemic: How to Protect Your Patients
The Problem is Becoming a Crisis

Data breaches are happening at an alarming rate. In fact, the threat of ransomware attacks has been elevated to crisis levels. While there’s increased awareness, attacks are becoming more sophisticated, and organizations large and small are being attacked. No one is immune. The healthcare industry has been, and continues to be, a prime target, and for good reason. Healthcare organizations are considered low-hanging fruit by cybercriminals. Hackers know healthcare centers are notorious for having weak security. Most hospitals don’t have procedures in place to restore a network once it’s locked by ransomware. Most hospital applications have little or no network segmentation, there are no firewalls between workloads, and basic security protocols are not in place.

Besides the alarming ransomware statistics, some attacks never get reported at all. The U.S. Department of Health and Human Services recorded 52 data breaches in October alone. Last year, hackers stole over 38 million medical records. These sobering statistics have made the healthcare industry take notice. Many healthcare organizations are taking steps to increase cybersecurity, but more can be done. This article looks at some recent ransomware cases, examines mistakes made in dealing with cyberattacks, and offers ways to improve cybersecurity and protect patient data moving forward.

The consequences of a data breach reach far beyond the breaking news story. There's more to it than the short article that appears on your screen. A single attack can close down an organization for good, and it can happen in minutes. The consequences can have long-lasting implications, particularly in the healthcare industry. Sure, the healthcare center's reputation gets flushed down the toilet, but there's also a real impact on patients. These incidents are not merely expensive inconveniences. Cyberattacks disrupt the entire ecosystem of the institution and put people's health, safety, and lives at risk.

Healthcare Worker Distressed by Ransomware Locking up IT systems
Security breaches will cost healthcare organizations $6 billion this year.

Often, the healthcare center gets victimized twice. First, there is a ransomware attack. Second, the healthcare system becomes the target of a class-action lawsuit from a community of angry patients and their families.

Consider the New Scientist article about the 2016 attack on the Hollywood Presbyterian Medical Center. It was a Friday afternoon, February 5, when malware infected the institution's computers, seizing patient data and cutting off staff communications. (The same day, computer hackers tried to steal $1 billion from the Federal Reserve Bank of New York.) It all happened in a matter of seconds. Medical records had to be kept with pen and paper. Staff dusted off old fax machines. Patients were sent to other hospitals, and operations were canceled. The medical center was back online after a two-week standoff, but only after paying a ransom of 50 bitcoins (the equivalent of $17,000 at the time).

Malware can infect an entire computer system after someone clicks a link to a booby-trapped website or opens an attachment in a phishing email. Immediately, the malware gets to work encrypting files; some strains can immobilize entire IT infrastructures. If your data is backed up and an attack strikes, you can always roll back to yesterday's copy. Healthcare targets often have their backs against the wall during a cyberattack precisely because they don't have their files backed up.
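To make the backup point concrete, here is a minimal, hypothetical sketch (Python, standard library only) of the kind of dated-snapshot backup that lets you "go back to yesterday's data." Real medical-records systems would use dedicated, off-site backup tooling; this only illustrates the principle.

```python
import shutil
from datetime import date
from pathlib import Path

def snapshot(data_dir: Path, backup_root: Path) -> Path:
    """Copy data_dir into a dated snapshot folder, e.g. backups/2020-03-01."""
    dest = backup_root / date.today().isoformat()
    if dest.exists():
        shutil.rmtree(dest)  # replace today's snapshot if re-run
    shutil.copytree(data_dir, dest)
    return dest

def restore(snapshot_dir: Path, data_dir: Path) -> None:
    """Roll the live data directory back to a previous snapshot."""
    shutil.rmtree(data_dir)
    shutil.copytree(snapshot_dir, data_dir)
```

The key design point is that snapshots live apart from the live data, so ransomware that encrypts the working files cannot touch yesterday's copy (in practice, backups should also be off-site or otherwise unreachable from the infected network).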

In most cases, a ransom is paid, the hackers deliver the decryption key, and the medical center is able to decrypt the seized files. The Hollywood Presbyterian Medical Center case was straightforward. They handled the crisis as best they could (see the pen-and-paper measures above), negotiated a lower ransom, and got their data back. More recent victims haven't been so lucky.

Medical malpractice has been part of the healthcare landscape since the 1960s. Now there is the additional risk of malpractice claims arising from ransomware attacks: if an attack affects a patient in any way, there will be repercussions.

Doctor Using Tablet
While only a few healthcare systems have policies around using mobile devices, there is a growing movement to regulate such devices.

Take the cyberattack on LifeBridge Health. Seven months after the incident, the Baltimore-based health system faced another problem: a class-action lawsuit. The suit claimed negligence on the part of the medical center and accused LifeBridge of waiting two months before informing the affected patients.

LifeBridge had to respond to the allegations. The organization contracted a national computer forensic team to investigate the attack. Patients were offered credit monitoring and identity protection services.

Clearly, basic mistakes contribute to breaches. Mistakes can allow the infiltration to happen in the first place, and resolving a ransomware situation is stressful; under pressure, people can do things that make the situation worse.

Ransomware Recovery Mistakes

Health Management Concepts (HMC) in Florida was attacked with ransomware. HMC learned about the incident on July 16; the official report was made on August 23. The ransom was paid and the attackers delivered the decryption keys. The hospital IT administration immediately took steps to decrypt the data. Then, to their horror, the HMC staff realized they had made the problem worse: they accidentally sent files containing patient information to the hackers.

UnityPoint Healthcare had the misfortune of suffering two security breaches in 2018. The second attack compromised the data of 1.4 million patients. At least, that’s the official tally. A series of phishing emails had been made to look like they were from a top executive within the company. An employee fell for the scam. It gave hackers the opportunity needed to penetrate the entire system.

The protection of healthcare assets is not just a matter of protecting patient information but protecting the patients themselves.
Recognizing the Risk is the First Step Toward Protecting Patient Information

The onslaught of cyberattacks against healthcare is relentless, but there are inspiring stories of medical centers fighting back. They're defending themselves against nefarious cyberattacks, saving money, increasing their efficiency, and better protecting their patients.

One such story belongs to the Interfaith Medical Center of Brooklyn, New York. It’s a 287-bed non-profit teaching hospital that treats more than 250,000 patients every year. They were able to avoid malware outbreaks. Their proactive approach enabled them to detect and respond immediately to advancing threats. Their strategy involved an assessment of threats and implementation of policies and procedures.

Incident response time is critical; measure it with a stopwatch, not a calendar. All the segmentation in the world isn't any good if the door can't be closed in time. Their program was successful: it identified malware infections long before they had a chance to become a problem. They were even able to identify a malware-infected medical device after it came back from a repair vendor.

The Interfaith Medical Center anticipated a ransomware attack and took steps to prepare for it. In a September 3, 2019, Healthcare IT News article, we learn how Christopher Frenz, the VP of Information Security, protected the non-profit's IT system. "One of the ways I approached this was simulating a mass malware outbreak within the hospital, using a custom-developed script and the EICAR test string. Running the script attempted to copy and execute the EICAR test string on each PC within the organization to simulate the lateral movement of a threat within the hospital. Exercises like these are great because they help an organization identify what security controls are effective, which controls are ineffective or in need of improvement, how well or not the staff response to an incident will be, and if there are any deficiencies in the organization's incident response plan," he explained.
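For readers curious what such a drill looks like on a small scale, here is a hedged Python sketch of the idea. It is not Frenz's actual script (which was not published), just an illustrative stand-in: it drops the industry-standard, harmless EICAR test string on a target path and checks whether endpoint protection removes or quarantines it.

```python
import time
from pathlib import Path

# The EICAR string is assembled at runtime in two halves so this script's
# own source file is not flagged by antivirus scanners.
EICAR = ("X5O!P%@AP[4\\PZX54(P^)7CC)7}$"
         "EICAR-STANDARD-ANTIVIRUS-TEST-FILE!$H+H*")

def drop_test_file(target_dir: Path) -> Path:
    """Write the EICAR string where an endpoint agent should detect it."""
    path = target_dir / "eicar_drill.com"
    path.write_text(EICAR)
    return path

def check_removed(path: Path, wait_seconds: float = 5.0) -> bool:
    """True if the endpoint agent deleted or quarantined the file in time."""
    deadline = time.time() + wait_seconds
    while time.time() < deadline:
        if not path.exists():
            return True
        time.sleep(0.5)
    return False
```

In a real exercise you would run this against a representative sample of machines and record which ones flagged the file and how quickly; the EICAR string is specifically designed to be safe while still triggering any working antivirus product.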

Christopher Frenz, Interfaith Medical Center's VP of Information Security
Christopher Frenz, VP of Information Security at Interfaith Medical Center, led the charge with a zero trust architecture that protected the network from cyberattacks and saved the healthcare system millions of dollars.
“We have successfully avoided malware outbreaks and are actively detecting and responding to advanced threats, long before they impact privacy or operations.”

Christopher Frenz, Interfaith Medical Center


The article ends with some excellent advice from Frenz. “Healthcare needs to begin to focus on more than just compliance alone, as it is far too easy to achieve a state where an organization meets compliance requirements but is still woefully insecure. Organizations need to put their security to the test. Pick solutions that can empirically be shown to improve their security posture.”


There are basic steps healthcare organizations can take to minimize their risk of ransomware attacks. Learn as much as you can about how attacks work. Consider all possible points of entry: where is your IT system vulnerable? Medical software used for patient data has numerous vulnerabilities. Healthcare cybersecurity statistics from the Kaspersky Security Bulletin found easy access to 1,500 devices used by healthcare professionals to process patient images such as X-rays.


Improving the cybersecurity of a healthcare organization, whether large or small, has two parts. One part has to do with the design and implementation of the entire IT system (i.e., whether or not backup and disaster recovery features are in place). The other part has to do with your human capital.


Malware can be introduced from any number of locations along your network, and attacks are often designed with multiple points of entry. It could be a phishing email where an employee is tricked into clicking something booby-trapped, or a bogus email that looks like it comes from an upper-level executive but is actually from a hacker.


ON-GOING EDUCATION AND REFRESHER COURSES
Healthcare Employees Being Educated on Cyber Security Procedures
Healthcare employees should have regular and comprehensive cyber threat education. This enables them to avoid falling into traps that can trigger ransomware. It also serves to establish a strong security culture.

Human beings make mistakes. This is especially true in the busy, high-stress environment of a hospital, or where doctors, nurses, and orderlies work extended 10- to 12-hour shifts. People have to be educated about the risks of cyberattacks and the forms such attacks might take. It's easy for a rushed employee at the tail end of a shift to unknowingly click a file, download unauthorized software, or be tricked into loading a contaminated thumb drive. There are basic security practices that should be implemented, such as creating strong passwords and changing them at regular intervals. Two-factor authentication is also a good idea.
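As one concrete illustration of a "strong password" rule, here is a minimal policy check in Python. The exact length and character-class thresholds below are assumptions for illustration only; a real deployment would follow the organization's own policy (and pair it with two-factor authentication).

```python
import re

def password_ok(pw: str, min_length: int = 12) -> bool:
    """Illustrative policy: minimum length plus a mix of character classes.
    The thresholds here are assumptions, not an official standard."""
    checks = [
        len(pw) >= min_length,            # long enough
        re.search(r"[a-z]", pw),          # has a lowercase letter
        re.search(r"[A-Z]", pw),          # has an uppercase letter
        re.search(r"\d", pw),             # has a digit
        re.search(r"[^A-Za-z0-9]", pw),   # has punctuation or a symbol
    ]
    return all(checks)
```

A check like this would typically run at account creation and at each scheduled password change, rejecting weak choices before they reach the directory.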

Cybercrooks study human vulnerability. Hackers continually figure out ways to exploit human traits and gullibility. Through social engineering tactics, cyberattackers design pathways to plant ransomware or gain a foothold in an information system.


SECURITY IS NOT ABOUT QUICK FIXES

Take the time to ensure staff and vendors are mindful of what they're doing. Review policies and procedures for handling patient data and for avoiding security incidents. As we have seen, any data breach has legal ramifications. There needs to be a systematic response that is carefully considered and forged into a process. Additionally, partner with the right vendor, one who can design and provide a holistic security solution that will protect your patients.

What is the Cloud?

How many of us really know what the cloud is? Oh sure, we know that the cloud involves storing and accessing stuff via the Internet, but do we understand the powerful transformational nature of cloud computing technology? Do we appreciate how it has changed, and continues to change, the way we live and work?

Not that long ago, if you mentioned the cloud, most people thought you were talking about the weather. As recently as 2012, Wakefield Research discovered that 51% of the people surveyed, most of whom were Millennials, thought that storm conditions could interfere with cloud computing. Later that same year, Business Insider reported that only 16% understood the cloud to be a network of Internet-connected devices used to store, access, and share data. So if you don't know that much about the cloud, don't feel bad. You're not alone.

Most people, if they think of the cloud at all, know it simply as a place to keep iTunes libraries, archived movies, or family pictures and videos. Consumers know the cloud as a storage service offered by Apple; our knowledge of iCloud usually comes from the company's invitation to add more space. Then there's Netflix, where millions of people access feature-length movies stored and delivered on demand via cloud technology. Do you store and share large files via Dropbox? Does your office use Microsoft Office 365?

This article won't describe the cloud per se, nor will it attempt to explain the various types and configurations of clouds. Rather, it offers a high-level overview of how cloud technology transforms companies and whole industries, and explores the way it changes how we work with each other all over the world. Technological growth is accelerating, and much of that acceleration comes from technologies blending together in the cloud.

The Supernova
The Cloud is a Supernova

We use a soft, fluffy metaphor like the cloud, but "the cloud" paints a misleading picture in our minds. The Pulitzer Prize-winning writer Thomas L. Friedman, in his book THANK YOU FOR BEING LATE, prefers to call the cloud "the supernova," a term coined by Microsoft computer designer Craig Mundie. Why refer to it as "the supernova" and not "the cloud"? In astronomy, a supernova is the explosion of a star: a huge astronomical event, in fact the largest explosion that takes place in space.

So too, the cloud is an incredible release of energy, one that reshapes every man-made system our society has built. And now, every person on the planet with Internet access can tap into its power. The only difference, Mundie points out, is that a star's supernova happens only once, while the computing supernova keeps releasing energy at an accelerating rate. It's interesting to note that the components that make up the cloud keep falling in cost even as performance keeps going up.

Just as the discovery of fire was a game-changer in the Stone Age, and electricity lit the way from one century to the next in the late 19th century, the cloud has fundamentally changed the modern world. There are more mobile devices on the planet than there are people. Soon everyone on the planet will be connected.

Go with the Flow

The cloud has large amounts of digital information moving in every direction, like a white-water current that runs fast through every channel. You have to learn to go with the flow if you're going to thrive; like maintaining homeostasis, going with the flow is how you keep your balance. You'll be better equipped to look ahead, predict trends, and respond to an ever-changing market.

The Flow of Knowledge Stocks

In the past, the traditional idea was to go to college, get an education, find a job where you could apply that education, show up, and do the work, and you'd be fine. You'd be set for life. The focus was on one person holding a stock of knowledge. Today, the focus has shifted to the flow of knowledge. As the 2009 Harvard Business Review article "Abandon Stocks, Embrace Flows" points out, it's no longer about having knowledge.

As the world accelerates, stocks of knowledge become outdated, and depreciate, at a faster rate. The premium shifts to updating knowledge; the choice marketable traits become a high level of curiosity and a finger on the pulse of the latest advancements. The same is true for the products you buy: notice how quickly product life cycles have compressed, and how even the most successful products fall by the wayside faster than before. We have to continually learn by participating in relevant flows of new knowledge. And it's not just a matter of diving into the flow when we feel like it; participating in and benefiting from the flow of knowledge requires that we contribute to it on an ongoing basis.

This is the world of the cloud. This is where workspaces connect globally. Ideas and knowledge are exchanged freely. The so-called little guy can compete with the big guy. In the March 2016 study “Digital Globalization: The New Era of Global Flows” by the McKinsey Global Institute, we see in great detail how the world is more interconnected than ever.

Many enterprise companies are taking advantage of this interconnectivity, leveraging the technology to tap into the knowledge flows moving around the planet. For example, Friedman describes in his book THANK YOU FOR BEING LATE how General Electric supplements its internal engineering resources by running global contests to see who can come up with the best design solutions. One such contest received 697 entries from companies and individuals all over the world.

It’s All About Interconnectivity

This interconnectivity is expanding "instantaneous exchanges of virtual goods." The cloud enables digitized financial flows to happen at unfathomable rates. The science journal Nature published "Physics in Finance: Trading at the Speed of Light," which presents an industry driven by ever-increasing speed and complexity. The article reports that more than 100,000 trades can occur in less than a second, for a single customer.

High-frequency trading relies on several things: fast computer algorithms for deciding what and when to buy and sell, live feeds of financial data, and fast network links that cost about $15,000 a month to rent.

Moving faster also increases the likelihood of mistakes. In 2012, a flaw in the algorithms of Knight Capital, one of the largest U.S. high-frequency trading firms, caused a loss of $440 million in 45 minutes. The algorithm accidentally bought at higher prices than it sold.

Data speedbumps act like traffic cops slowing down the flow of traffic.

Some trading firms established a way to keep the traffic from moving too fast: a kind of digital speed bump that slows the flow of digital traffic by 350 microseconds. Apparently that was all the time traders needed to blunt the advantage of the fastest feeds. That a mere 350-microsecond speed bump matters suggests we've already surpassed the optimal speed for trading.

Speed & Complexity Are Free

Because information moves so much faster now, global markets have become more interdependent. Remember when China made some financial missteps in 2015? It caused a ripple effect that stretched across the planet, and Americans felt it immediately. On August 26, 2015, CNN.com reported:

“The American stock market has surrendered a stunning $2.1 trillion of value in just the last 6 days of market chaos. The enormous losses reflect deep fears gripping markets about how the world economy will fare amid a deepening economic downturn in China. The Dow, S&P 500, and Nasdaq have all tumbled into correction territory. It is their first 10% decline from a recent high since 2011. The dramatic retreat on Wall Street has been fueled by serious concerns about the fallout of China’s economic slowdown.”

PayPal has become one of the most important drivers of digital finance. The company set out to democratize financial services by enabling every citizen to move and manage money. The explosion of smartphones gave users all the power of a bank branch at their fingertips, and the incremental cost of adding a customer is nearly zero. What is commonplace for Americans, sending money to someone, paying a bill, or getting a loan, became simple, easy, and nearly free for 3 billion people around the world, people who previously had to stand in line for hours to change their currency and then stand in another line for hours to pay a bill.

PayPal doesn't rely on FICO scores the way a traditional bank or credit card company does. Instead, it uses its own big-data analytics based on your actual transaction activity on its site, which gives it a more accurate picture of your creditworthiness. The result: instant loans to more people around the world, with a higher rate of payback. PayPal is one of the companies eliminating the need for cash, and it is also experimenting with blockchain for validating and relaying global transactions through multiple computers.

Cloud technology has brought with it a period of adjustment. We need time to absorb, learn, and get used to working differently. The cloud will make economies measurably more productive. Because of it, individuals, groups, and organizations are now on a level playing field and can shape the world around them in unprecedented ways, with less effort.

Leverage & Synergy

There has never been a better time to become a maker, an inventor, a founder, or an innovator. It's leverage and synergy in action as never before.

Leveraging Technology


Consider some of these examples:

Uber

The world’s largest taxi company owns no taxis

Facebook

The most popular media owner creates no media

Alibaba

The world’s most valuable retailer has no inventory

Airbnb

The largest accommodation provider owns no real estate

THE DOUBLE-EDGED SWORD

Technology has always been an amplifier of the best and worst of humanity. It tends to magnify our psychological and spiritual condition, both good and bad. Cloud technology is a double-edged sword. On one hand, it empowers individuals, groups, and organizations as never before. Companies communicate faster and more fluidly. Small boutique shops can become multi-national enterprises in a short amount of time. More brains are connected globally. The smallest voices can be heard everywhere for the first time.

Alternately, technology can be used to belittle and disempower. Just as the cloud enables builders and makers, it also gives power to breakers. One person can do more damage, more cheaply and more easily. Take Navinder Singh Sarao, for example. Operating from one computer on a network connection out of his parents' house in West London, Sarao single-handedly manipulated the U.S. stock market into losing a trillion dollars in less than half an hour. He "spoofed" the Chicago Mercantile Exchange into setting off a terrible chain reaction. Spoofing is an illegal technique of flooding the market with bogus buy and sell orders so that other traders, both human and machine, are fooled into helping the perpetrator buy low or sell high. Sarao had developed algorithms to alter how his orders would be perceived by other computers.

Big forces can come out of nowhere and crush your business. You’ll never see them coming. The mobile broadband-supernova is a double-edged sword. How it’s used depends on the values and tools we want to put into place.

WE BECOME WHAT WE BEHOLD
We shape our tools and then our tools shape us.

In sum, the cloud, our technological broadband supernova, is here to stay. It won't be the same cloud a few months from now, but it's here to stay, and things will continue to accelerate. It's going to be difficult for many to keep up; keeping up may be one of the great challenges facing society in the decades to come.

In answering the question, “Why is the world changing so fast?” Dr. Eric C. Leuthardt states in his “Brains and Machines” blog:

The reason for accelerating change is similar to why networked computers are so powerful. The more processing cores you add, the faster any given function occurs. Similarly, the more integrated that humans are able to exchange ideas the more rapidly they’ll be able to accomplish novel insights.

Different from Moore's Law, which involves the compiling of logic units to perform more rapid analytic functions, increased communication is the compiling of creative units (i.e., humans) to perform ever more creative tasks.

A great primer for anyone interested in understanding the transformational power of cloud technology is Thomas L. Friedman's 2016 book THANK YOU FOR BEING LATE: AN OPTIMIST'S GUIDE TO THRIVING IN THE AGE OF ACCELERATIONS.

Ransomware Risk Mitigation: The Desktop-as-a-Service Solution

Ransomware is a dangerous and growing threat. Find out how security-minded executives establish best-in-class protection.

2019 has proven to be an alarming year for cybersecurity professionals and cyber-attacks show no signs of slowing down in 2020.

One cybersecurity firm characterized the rapidly growing pace of cyberthreats across all industries as an “unprecedented and unrelenting barrage”. Within 24 hours of its report, the City of New Orleans and several other municipal organizations fell victim to ransomware attacks.

But it’s not just large-scale enterprises and public institutions that are under attack. Small and mid-sized businesses offer low-hanging fruit for opportunistic cyber criminals, who often use automation to widen their area of attack.

Small businesses, large enterprises, and public institutions alike have all struggled to respond decisively to the ransomware threat. Until recently, executives had few options – and fewer defenses – in their fight against cybercrime. Now, Desktop as a Service (DaaS) solutions offer comprehensive, scalable ransomware protection services to organizations of all sizes.

What Exactly is Ransomware and How Does It Work?

There are several ways for a cyber intruder to take over your computer system without your knowledge, and you often won't find out until it's too late.

The typical ransomware attack begins with the stealthy takeover of the victim’s computer. This may be accomplished through phishing, social engineering, or a sophisticated zero-day exploit – the goal is to have access to the network while remaining undetected.

Upon compromising the network, the cybercriminal can begin slowly encrypting important files. Most ransomware applications do this automatically, using a variety of different methods to evade detection. The process may take days, weeks, or months to complete.

Once the ransomware encryption algorithm reaches critical mass, it then locks users out of the network, displaying a ransom note demanding payment for a decryption key. Sometimes the demand is small – on the order of $500 to $1000 – and sometimes the demand reaches into six-figure sums.

Ransom demands are usually for bitcoins. “If one organization is willing to pay $500,000, the next may be willing to pay $600,000.”

Small sums make paying the ransom a tempting option, but a dangerous one. There is no guarantee that the cyber attacker will relinquish control of the network. Instead, executives who pay up reinforce the cybercriminal profit cycle. It is only a matter of time before the ransomware attacker strikes again.

Famous examples of ransomware variants include WannaCry, which spread to over 230,000 computers across 150 countries in 2017, and Petya. The WannaCry crisis targeted healthcare clinics and hospitals, causing untold damage and highlighting the risk that outdated IT systems represent in these industries.

Petya was unique because it did not encrypt specific files. Instead, it encrypted the local hard drive's Master File Table, rendering the entire device unusable. There are dozens of other variants out there, and each one uses a unique strategy to take advantage of victims. NotPetya built on Petya's attack method, using the same vulnerability previously exploited by WannaCry.

Who Is At Risk of Ransomware Attacks?

Everyone. Although high-profile targets like hospitals and municipal institutions make headlines, thousands of business owners are defrauded every day. On average, one business falls victim to ransomware every 14 seconds.

Emsisoft reports that during the first half of 2019, 491 healthcare providers were hit with ransomware. The attacks are increasing, and the ransom demands are growing larger.

Small and mid-sized businesses are especially vulnerable because they typically do not have access to the kind of comprehensive security resources that large enterprises can afford. Small businesses that do not rely on reputable third-party managed service providers make especially easy targets.

Cybercriminals have shown that they are willing to target hospitals and public institutions without shame. The greater the need for functioning IT systems is, the more likely the cybercriminals are to get paid. This is how the cybercrime profit cycle perpetuates itself.

What Can Small and Mid-sized Businesses Do About Ransomware?


Organizations caught unprepared have few options. Although cybersecurity experts correctly warn against paying the ransom, desperate business owners often pay anyway. But the relief is only temporary: 60% of small and mid-sized businesses victimized by cybercriminals do not recover, shutting down within six months.

Preparation is key to successfully resisting a ransomware attack. Organizations that cannot afford to develop, implement, and deploy state-of-the-art security resources need to contract a reputable third-party vendor for the purpose.

Even enterprise-level organizations with tens of thousands of employees often find themselves opting for a managed solution instead of an in-house one. The cybersecurity industry is experiencing a widening talent shortage, making it difficult even for deep-pocketed businesses to hold on to their best security officers.

Introducing IronOrbit: Comprehensive Ransomware Protection

IronOrbit achieves best-in-class ransomware protection through a unique approach to cloud desktop hosting. Three key processes must work together flawlessly to guarantee ransomware resilience:

1.   Prevention

The best way to prevent a ransomware attack is to stop the initial malware deployment. Firewalls, email filters, content filters, and constant patch management all play a critical role in keeping malicious code out of DaaS systems.

Maintaining up-to-date software is more important than most executives and employees realize. Since NotPetya used the same attack vector as WannaCry, its victims consisted entirely of individuals and businesses who had neglected to install security patches after the WannaCry crisis.

2.   Recovery

There is no way to guarantee 100% prevention. However, business owners and their IT teams can circumvent the damage ransomware causes with consistent backup and restoration tools. IronOrbit’s disaster recovery features can wind back the clock, reloading your entire suite of business systems to the state they were in just before the attack occurred.

3.   Remediation

Ransomware recovery cannot guarantee business continuity on its own without best-in-class remediation tools. Without the ability to trace the attack to its source in a fully logged environment, there is no way to tell whether the attack has been truly averted or not. IronOrbit uses state-of-the-art digital investigation tools to track ransomware attacks to their source and mitigate them.

Schedule a Consultation with an IronOrbit Security Expert

IronOrbit has helped numerous businesses capitalize on the efficiency and peace of mind that secure DaaS solutions offer. Protect your business from the threat of ransomware with the help of our expertise and knowledge.

 

The Top Cloud Solutions Every AEC Firm Should Be Using in 2020

The quantity and quality of cloud offerings have grown significantly in the last few years. There are a number of new solutions available to AEC firms that are especially worth taking a look at. We’ll look at a few of the top cloud solutions in this article.

The AEC industry faces tremendous growth driven by urbanization and globalization. As traditional data centers become insufficient to meet these demands, the demand for security and efficiency grows with them. Hyper-automation, the distributed cloud, and practical blockchain are just some of the trends that will continue to proliferate in 2020, and each has the ability to transform and optimize initiatives. The AEC sector has done a good job of keeping up to date: more than two-thirds of AEC firms store data in the cloud. The reason is that cloud solutions are an important ingredient in optimizing workflow, costs, and sustainability.

Cloud storage, for instance, can be a more cost-effective alternative to spending thousands on upgrading your local IT. Cloud computing has become increasingly popular for a number of reasons: it’s more affordable, it handles computationally intensive work, it offers workspace flexibility, and it’s more secure.

There is also the advantage that, with cloud-based computing, you can view and work with complex renderings on an underpowered device.

Scalability is also easier in the cloud. Most providers allow for scalable, on-demand resource usage. This enables your company to have more computing power when it’s needed.

Cloud storage providers deliver benefits that are difficult to duplicate on premises. A Cisco report suggested, “By 2021, 94 percent of workloads and compute instances will be processed by cloud data centers.”

Firms that have moved to the cloud have an edge over the competition. With that in mind, let’s take a look at what kinds of cloud services are available and how each one benefits an AEC firm.

 

1 – Cloud Storage

Housing large CAD files with a cloud storage provider is less expensive than storing them on premises.

 

Cloud storage is a great solution because it is simple and offers several important benefits.

First, cloud storage providers back up your data. Cloud storage companies often keep servers in geographically separate regions. One server might be on the West Coast of the United States and another on the East Coast, so that even a large-scale disaster won’t wipe out your files.

Files stored in the cloud can be accessed from anywhere. Whether your crew is working 10 or 1,000 miles away, they’ll have easy access to everything stored online. It’s also simple to set permission levels on various files. For example, an administrator can dictate that renderings can be viewed at the job site but only edited in the office.
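To make the idea concrete, a location-aware permission check like the one described above could be sketched as follows. This is a minimal illustration; the roles, actions, and location names are hypothetical, not taken from any particular storage product:

```python
from dataclasses import dataclass

# Hypothetical permission model: each grant pairs a role and an action
# with the set of locations from which that action is allowed.
@dataclass(frozen=True)
class Grant:
    role: str
    action: str        # "view" or "edit"
    locations: set     # where the action is permitted

# Renderings can be viewed anywhere, but edited only in the office.
RENDERING_GRANTS = [
    Grant("drafter", "view", {"office", "job_site"}),
    Grant("drafter", "edit", {"office"}),
]

def is_allowed(role, action, location, grants):
    """Return True if any grant permits this role/action/location combination."""
    return any(
        g.role == role and g.action == action and location in g.locations
        for g in grants
    )

print(is_allowed("drafter", "view", "job_site", RENDERING_GRANTS))  # True
print(is_allowed("drafter", "edit", "job_site", RENDERING_GRANTS))  # False
```

Real cloud storage products expose this through admin consoles rather than code, but the underlying rule evaluation works much like this sketch.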

 

Everyday examples of cloud storage providers include Dropbox, Google Drive, SharePoint, and OneDrive.

2 – Cloud Storage Gateways

Cloud storage gateways can help reduce costs in a number of ways. Data compression shrinks files, which reduces both bandwidth usage and storage costs. Gateways can also make smart decisions about where to save files. Files that are accessed frequently are called “hot files,” and hot files can be more expensive to store online. A gateway may keep them in local storage while moving infrequently used files to the cloud.
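A gateway’s hot/cold placement decision boils down to a simple rule over recent access history. The sketch below shows the idea; the seven-day window and three-access threshold are illustrative assumptions, not values from any real gateway product:

```python
from datetime import datetime, timedelta

# Assumed policy: a file accessed at least 3 times in the last 7 days
# is "hot" and stays on fast local storage; everything else goes to
# cheaper cloud storage.
HOT_WINDOW = timedelta(days=7)
ACCESS_THRESHOLD = 3

def choose_tier(access_times, now):
    """Return "local" for hot files, "cloud" for cold ones."""
    recent = [t for t in access_times if now - t <= HOT_WINDOW]
    return "local" if len(recent) >= ACCESS_THRESHOLD else "cloud"

now = datetime(2020, 4, 1)
hot_file = [now - timedelta(days=d) for d in (1, 2, 3)]   # opened daily
cold_file = [now - timedelta(days=90)]                    # untouched for months

print(choose_tier(hot_file, now))   # local
print(choose_tier(cold_file, now))  # cloud
```

Production gateways layer caching, compression, and deduplication on top of this kind of rule, but the tiering decision itself is this simple at heart.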

Panzura

Panzura offers a cloud-based version designed to create a shared environment for everyone to work in. With Panzura, users can store CAD and BIM files online and open them in a matter of seconds, not minutes. All files can be accessed from any location, making collaboration easier. Panzura claims to “reduce infrastructure costs up to 70%” over traditional data centers.

Like all cloud storage service providers, Panzura makes it easy to share files and collaborate across a variety of devices.

Another of Panzura’s interesting features is the work-sharing monitor. Work-sharing allows for remote viewing of another Panzura user’s workstation. As with other cloud-based solutions, Panzura can scale as your firm grows.

3 – Cloud-Based Accounting and Management Applications

Running your firm’s accounting and management applications in the cloud simplifies workflow. Collaboration is fluid when everyone at the company has access to the files, whether they’re at the office, at home, or in a hotel room on the other side of the country.

Decreased IT and Hardware Costs: You’ll be able to significantly reduce IT and hardware costs when you host your Sage or QuickBooks software off-premise with a cloud hosting provider.

The cloud enables collaborators to work in real time. Different permission models can be set for different users. It is also easier to collaborate with suppliers, distributors, and contractors: since the files and data are already online, it’s simple to give outside parties access. Localized data, by contrast, is far more difficult to share.

Deltek is a cloud-based solution to track projects. Project steps can be broad or detailed. Deltek tracks billable hours, resource usage, and expenses. If your firm uses different programs for different purposes, it may be time to consolidate.

While localized programs have offered project tracking for years, Deltek makes collaboration easier for the whole team. Everyone from accounting to the drafting team has access to the same program, and the functionality is the same whether they’re on a $5,000 workstation or a $200 smartphone.

 

4 – Internet of Things (IoT)

The number of Internet-connected devices is growing at an exponential rate. As microchips and transmitters are becoming more affordable, more applications and innovations are being introduced onto the market. This means more tools for the AEC industry to track and improve efficiency.


Internet-connected GPS devices, for instance, are great at tracking fleet mileage. They can also make recommendations about necessary vehicle maintenance. A Bluetooth tag attached to a piece of equipment makes it easy to locate on a crowded job site and helps recover lost tools.

 

5 – Hosted Desktops

Hosted desktops transform computers into more powerful workstations without having to purchase expensive PC hardware.

A hosted desktop is ideal for AEC firms running multiple AutoCAD workstations. Scaling is also a breeze as hosted desktops can increase their resources to handle any task.

A hosted desktop transforms a cheap laptop or tablet into a powerful workstation, the kind of device that can launch power-hungry programs and model complex drawings. That makes it great for crews out in the field who don’t normally have access to powerful computers.

 

IronOrbit INFINITY: The All-in-One Solution
The all-in-one solution offered by IronOrbit provides peace of mind, increased agility, and true synergy with key organizational objectives.

In a 2017 article entitled 7 Reasons Why AEC Firms Need Cloud Software, Eliza Fisher points out “cloud-based platforms are more appealing than hosting project management software because they do not require users to rely on their own computing power. There are quite a few other benefits to consider if you’re thinking of adopting a cloud solution for your AEC firm.” The article goes on to list the following benefits:

  1. Cost-Efficiency
  2. Safety & Data Loss Prevention
  3. Easy Installation & Maintenance
  4. Remote Access
  5. Streamlined Work
  6. Scalability
  7. Insights, Audits, & Compliance

While it’s true that moving to the cloud will have a positive impact on the way you do business, it’s important to shop around for the service provider that best fits your company’s future strategy. Keep in mind, not all clouds are created equal. Cloud-based products like our INFINITY Workspaces offer tremendous functionality and flexibility.

Our GPU-Accelerated INFINITY Workspaces combine the best features of cloud-based solutions into one place, including:

  • Hosted desktops
  • Cloud storage (including Panzura integration)
  • Application hosting (any application, including accounting and ERP software)
  • Unlimited computing, upgrades, and bandwidth
  • Managed backups and disaster recovery
  • Managed security and compliance
  • 24/7 US-based IT support

The GPU-Accelerated INFINITY Workspaces allow access to CPU- and graphics-intensive applications from anywhere and on any device. End users will enjoy high uptime, low latency, and incredibly fast performance.

With unlimited CPU and RAM upgrades, you never have to worry about running out of processing power. Collaboration is also easy when your whole team is working in a centralized environment.

Centralization allows the team to work with the same software version. The whole team uses the same application and works from the same set of files. There’s no longer any concern that employees are accessing different versions.

Inconsistent files can cause delays, accidents, compliance violations, and more. Having the team work from the same set of files is a considerable benefit.

Then there’s the convenience of dealing with one vendor. Forget about tracking costs from multiple service providers. There’s no dealing with half a dozen account managers and support teams. IronOrbit gives you everything in one package so that you’ll only ever need to work with a single company.

An Easy-To-Manage and Future-Proof Solution, Too

Speaking of the all-in-one package, IronOrbit wants to make cloud management as simple as possible. They take care of setup, backups, application security, and user support.

Other cloud solutions require the client to set up the service, integrate it with existing infrastructure, and manage it on an ongoing basis. With IronOrbit, these time-consuming processes are gone.

When considering a cloud service provider, it’s important to think about the future. IronOrbit is backed by today’s cutting-edge technology. It’s flexible enough to support any future technology growth.

IronOrbit is constantly upgrading its infrastructure. Its clients can take advantage of unlimited GPU and RAM, unlimited bandwidth, and unlimited upgrades to the latest versions of Windows and Microsoft Office.

Whatever devices and operating systems the future holds, IronOrbit’s GPU-Accelerated INFINITY Workspaces will be ready to support them and deliver the same level of service that you’d expect from a professional cloud-solution package. With INFINITY’s centralized, all-inclusive and convenient software you’re always in good hands.

In his book Thank You for Being Late, Thomas Friedman refers to the exponential acceleration in technology driven by Moore’s Law: the speed and power of microchips double roughly every 24 months.
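The compounding behind that observation is easy to verify: doubling every 24 months multiplies capability 32-fold over a decade. A quick sketch of the arithmetic:

```python
def moores_law_factor(years, doubling_period_years=2):
    """Growth factor when capability doubles every `doubling_period_years` years."""
    return 2 ** (years / doubling_period_years)

# Doubling every 24 months compounds quickly:
print(moores_law_factor(2))   # 2.0  -- one doubling
print(moores_law_factor(10))  # 32.0 -- five doublings in a decade
```

This is the simple exponential model the text describes, not a claim about any specific chip generation.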

In this age of acceleration, companies have 3 major objectives. They want to do more. They want to move faster. They want to do it at less cost.

To achieve all three objectives, a company has to ensure one thing: alignment between its business objectives and its IT capabilities. By harmonizing clear objectives with IT capabilities, companies can realize their full potential. Not all companies are able to follow that path on their own. They may know they need to change, but not know how to get there.

Even when they do know how to get there, they may not know who will manage the new environments. It’s increasingly common for company leadership to have no clear picture of what their future state should look like. Our primary job is to listen to the customer: understand the client’s objectives and future-state desires, then analyze those requirements against IT capabilities. The infrastructure has to be solid yet flexible enough to support future business goals. To the degree this harmony between goals and technology is in place, the intended transformations will happen.

It’s Time for an Upgrade

In the last couple of years, there have been a lot of exciting developments in the AEC cloud industry. From time and maintenance tracking to hosted desktops that let users render complex projects on their iPhones, the cloud has much to offer.

Then there are the savings: factor in the thousands or tens of thousands of dollars you can save by not having to continuously upgrade your workstations.

If you haven’t looked into moving to the cloud yet, there has never been a better time. The AEC industry is heading toward cloud computing faster than ever, and no firm wants to play catch-up with its competitors.