The Best Practices for Email Security Webinar

Last week we kicked off our Webinar Series.

Modern attacks are rapidly growing in volume and sophistication.

All it takes is for one employee to open a single malicious attachment, or click one infected link, and your company’s entire cybersecurity is put at risk.

Because these actions are so simple and common – opening an email, following a link to a website – it is becoming harder for companies to strengthen their security. However, with robust employee security training, your business can reduce the risk of staff members opening the door to digital attackers.

Watch our webinar on Best Practices for Email Security to learn important ways to protect your business.

Hosted by Mark Lukie from Barracuda Networks alongside our Senior Solutions Architect, Jarred Jenkins.

Stay tuned for more upcoming webinars.



Introducing Barracuda Sentinel


Microsoft Sway

Get Started with Microsoft Sway

New to Microsoft Sway? This article provides an overview of how easy it is to create and share anything with Sway’s colorful and interactive canvas. Learn how to create and preview your first Sway, how to add and embed content, and how to share your finished creation with others.

What is Sway?

Sway intro video

Use Sway to reimagine how your ideas come to life

Sway is a new app from Microsoft Office that makes it easy to create and share interactive reports, personal stories, presentations, and more.

Start by adding your own text and pictures, search for and import relevant content from other sources, and then watch Sway do the rest. With Sway, you’re no longer limited to picking a pre-designed template that makes your presentations look like everyone else’s, and you don’t have to have any design skills to transform and showcase information in modern, interactive, and attention-getting ways.

With Sway, there’s no need to spend lots of time on formatting. Its built-in design engine takes care of making your creation look its best. If the initial design doesn’t quite match your taste or mood, you can easily apply another — or fully customize your layout to make it your own.

It’s super easy to share your finished Sways. Family, friends, classmates, and coworkers can see your creations on the Web without signing up, signing in, or downloading anything. And you can change the privacy settings for any Sway whenever you want more control over what you share.

Sway is free to use for anyone with a Microsoft Account (Hotmail, Live, or Outlook.com). You can create more sophisticated Sways with more content when you use Sway as part of an Office 365 subscription. For more information, see Add more content to your Sways with Office 365.

What can I create with Sway?

Whether it’s a report, a presentation, a newsletter, a personal story, a photo album, or a visual trip report, there’s virtually no limit on what you can express creatively with Sway.


If you’re not sure what’s possible, you can get inspiration by viewing and interacting with Sways that other people have created. After you’ve signed in to Sway (see below), scroll down to the bottom of the My Sways page, and then browse through the featured content under the heading “Get inspired by a featured Sway.” You can also choose to start with one of the featured templates to begin using and learning Sway.

Sign in to start creating

To get started with Sway, visit www.sway.com in any browser and then click Sign in on the top menu bar.

Sign In button on the toolbar

When prompted, enter the email address you want to use with Sway. You can use your free Microsoft Account (Hotmail, Outlook.com) or any organizational account given to you by your work or school. If you don’t already have an account, visit www.microsoft.com/account to sign up for free.

On the My Sways page that opens after you’ve signed in, click or tap Create New to start creating your first Sway.

Create New button on the My Sways page

Get to know the Sway Storyline

The Storyline is where you type, insert, edit, and format the content that tells your story. Content is arranged in sequential order by adding so-called “Cards,” each of which holds the type of content you want — such as text, images, videos, and even Office documents. The order of cards can be rearranged at any time to suit your needs.

Cards task pane and Sway Storyline

Give your Sway a title

Click the Title your Sway placeholder text shown in the first card on the Storyline, and then type a short but meaningful description of what your Sway is all about. When you later share your finished Sway, this title will be the first thing that others will see.

Title prompt on the Sway Storyline

Add images and text to your Sway

To add basic content to your Sway, such as text and images, click or tap the <+> icon in the bottom left corner of any existing card, and then choose the type of content you want to add. To see all available options, click Cards on the top menu bar. If you prefer, you can also drag and drop text and images right onto your Storyline. (Don’t hesitate to experiment — you can change the order of your content at any time and customize each card the way you want.)

Add images and text to the Storyline

Add content to your Sway

You can easily search for and add additional content to your Sway, such as an image that is stored on your computer or mobile device. Sway can also search the Web for the most relevant content, such as videos and tweets, and add it to your Sway. On the top menu bar, click Insert, select your preferred content source from the menu, and then enter any search keyword or phrase into the Search sources box.

Insert menu and content search box

Preview your Sway

You can preview your work in progress at any time by clicking the left-facing arrow next to the Preview pane near the upper right. When you preview your Sway, you can see how it will appear to others when you later decide to share it. To fully experience your Sway (including any interactivity options that you’ve added on the Layout menu), click the Play button on the top menu bar.

Preview the current Sway

To return to your Storyline when you’re done previewing your Sway, click the right-facing arrow near the upper left.

Exit Preview to Storyline view

Change the design and layout of your Sway

Sway lets you focus on what you’re trying to communicate by taking care of the formatting, design, and layout of your content. You can keep Sway’s suggested default design, select and apply your own, and even customize the layout.

To choose a design for your Sway, click Design on the top menu bar, and then select the theme you want. To choose a random look and mood for your Sway at any time, click the Remix! button on the top menu bar until you find a design that suits your taste. You can also adjust a specific part of the currently applied theme, such as color, font choices, and the emphasis of animation by clicking the Customize button in the Design pane.

Design and Layout options in Sway

If you want to control how others will view and navigate your Sway once you’ve shared it, click the Layout button on the top menu bar and then select whether your content should scroll vertically, horizontally, or appear like a presentation.

Share your Sway

Ready to share your Sway with the world — or perhaps just with selected people? Click the Share button on the top menu bar and then select how you want to share your Sway. Your choices on this menu depend on the type of account that you used to sign in to Sway.

Sharing options in Sway

For a more detailed look at all available sharing options, see Share your Sway.

Go mobile with Sway

Sway works in all modern mobile browsers, regardless of platform. Whether you’re taking the sightseeing trip of a lifetime, documenting research for school, or taking part in an important business conference, Sway is always just a tap away.

  • Sway.com
    Visit www.sway.com in any mobile browser, no matter what the platform or device.
  • Sway for iOS
    Find the free Sway app for your Apple iPhone or iPad on the App Store.
  • Sway for Windows 10
    Find the free Sway app for your Windows 10 device in the Windows Store.

Discover Accessibility features in Sway

The browser that is used to author and view a Sway determines the Accessibility features that are available. For best results, we recommend Internet Explorer, Firefox, or Safari.

You can use Sway in a high-contrast mode with full keyboard functionality and screen reader access to your content. Click More Options ( . . . ) on the top menu bar, and then click or tap Accessibility view. To quit Accessibility view, use the same command again.

For more information about accessibility in Sway, see Microsoft’s Sway accessibility documentation.

Microsoft Teams rolls out to Office 365 customers worldwide


This post was written by Kirk Koenigsbauer, corporate vice president for the Office team.

Today, during a global webcast from Microsoft headquarters, we announced that Microsoft Teams—the chat-based workspace in Office 365—is now generally available in 181 markets and in 19 languages. Since announcing the preview in November, more than 50,000 organizations have started using Microsoft Teams, including Accenture, Alaska Airlines, Cerner Corporation, ConocoPhillips, Deloitte, Expedia, J.B. Hunt, J. Walter Thompson, Hendrick Motorsports, Sage, Trek Bicycle and Three UK. We’ve also introduced more than 100 new features to deliver ongoing innovation and address top customer requests.

With more than 85 million active users, Office 365 empowers individuals, teams and entire organizations with the broadest and deepest toolkit for collaboration. Office 365 is designed to meet the unique workstyle of every group with purpose-built, integrated applications: Outlook for enterprise-grade email; SharePoint for intelligent content management; Yammer for networking across the organization; Skype for Business as the backbone for enterprise voice and video; and now, Microsoft Teams, the new chat-based workspace in Office 365.

Microsoft Teams—the chat-based workspace in Office 365

Microsoft Teams is a digital workspace built on four core promises: chat for today’s teams, a hub for teamwork, customization options and security teams trust.

Chat for today’s teams

Microsoft Teams provides a modern conversations experience, with threaded, persistent chat to keep everyone engaged. We’ve rolled out many new communication features since preview, including audio calling from mobile devices, plus video on Android, which is coming soon to iOS and Windows Phone. And we’ve addressed numerous customer requests, adding the ability to email a channel (including attachments), send messages with markdown-based formatting, and receive notifications about all posts in a channel.

Move a conversation from email into Microsoft Teams with rich formatting, including attachments.

Hub for teamwork

The Office 365 applications and services that people use every day—including Word, Excel, PowerPoint, OneNote, SharePoint and Power BI*—are built into Microsoft Teams, giving people the information and tools they need. We’ve recently added support for open, public teams within an organization. We’ve also enhanced the meeting experience by adding scheduling capabilities, integrating free/busy calendar availability for team members, adding recurrence, and making it easier to transition from chat to high-quality voice and video.

Ad hoc and scheduled voice and video meetings right from within Microsoft Teams.

Customizable for every team

Every team is unique, so we’ve made it easy for teams to customize their workspace with Tabs, Connectors and Bots. More than 150 integrations are available or coming soon, including Bots from hipmunk, Growbot and ModuleQ. We’re also partnering with SAP and Trello to build new integrations. SAP SuccessFactors will help employees and managers track goals and performance as part of the way they work in Microsoft Teams every day. Trello will empower teams to easily get projects done with boards, lists and cards right within Microsoft Teams. These partnerships let users bring important apps and services into Microsoft Teams, truly making it their own hub for teamwork.

New Bots help you to complete tasks within your conversations.

Security teams trust

Finally, Microsoft Teams is built on the Office 365 hyper-scale, enterprise-grade cloud, delivering the advanced security and compliance capabilities our customers expect. Microsoft Teams supports global standards, including SOC 1, SOC 2, EU Model Clauses, ISO27001 and HIPAA. We also added support for audit log search, eDiscovery and legal hold* for channels, chats and files as well as mobile application management with Microsoft Intune.* And starting today, Microsoft Teams is automatically provisioned within Office 365.

These security and compliance capabilities are critical for enterprise customers, but our responsibility at Microsoft goes beyond this. Our mission is to empower every person and every organization on the planet to achieve more. With that in mind, we’re working to ensure every team member can participate, with new accessibility features, including support for screen readers, high contrast and keyboard-only navigation. This will enable Microsoft Teams to be more inclusive and tap into the collective brainpower and potential of every person.

Customers achieve more with Microsoft Teams

We’re thrilled by the enthusiasm of customers like Trek Bicycle, who’ve built Microsoft Teams into the way they work every day.

“Across Trek’s global teams, the integrated collection of Office 365 applications serves up a common toolset to collaboratively drive the business forward. We see Microsoft Teams as the project hub of Office 365 where everybody knows where to find the latest documents, notes and tasks, all in-line with team conversations for complete context. Teams is quickly becoming a key part of Trek’s get-things-done-fast culture.”
—Laurie Koch, vice president of Global Customer Service at Trek Bicycle

Myths About Moving to the Cloud

What all SMBs need to know about moving to the cloud
Office 365

Myth 1: Office 365 is just Office tools in the cloud, and I can only use it online.
Myth 2: If our data moves to the cloud, our business will no longer have control over our technology.
Myth 3: Keeping data on-premises is safer than in the cloud.
Myth 4: I have to move everything to the cloud; it is an all-or-nothing scenario.
Myth 5: Cloud migration is too much for my business to handle.
Myth 6: Corporate spies, cyber thieves, and governments will have access to my data if it is in the cloud.
Myth 7: Skype and Skype for Business are one and the same.
Myth 8: Email isn’t any simpler in the cloud.
Myth 9: Continuously updating Office 365 will break my critical business applications.

Myth 1: Office 365 is just Office tools in the cloud, and I can only use it online.

FACT: Office 365 is a suite of cloud-based productivity services, which can include:
  • Office 365 ProPlus or Office 365 Business – the Office desktop client you already know and use, including Microsoft Word, Excel, PowerPoint, Outlook, and OneNote, with the added benefit of being licensed, deployed, and updated as a service. These applications are installed on your device so they’re available even when you are offline. And you have the option to store data in the cloud.
  • Exchange Online for email and calendaring.
  • SharePoint Online and OneDrive for Business for collaboration, websites, workflows, and enterprise file sync and share.
  • Skype for Business for voice, IM, meetings, and presence.
  • Yammer for social collaboration.

 

Myth 2: If our data moves to the cloud, our business will no longer have control over our technology.

FACT:
  • When you move to the cloud, headaches and time spent maintaining hardware and upgrading software are significantly reduced. Now you and your team can focus on the business rather than being a repair service. You have more time to spend improving business operations and launching agile initiatives.
  • Instead of spending ever-larger portions of your capital budget on servers for email storage and workloads, you can think strategically and support business managers in a much more agile fashion, responding to their needs quickly.

Myth 3: Keeping data on-premises is safer than in the cloud.

FACT:
  • It’s becoming increasingly clear that your on-premises systems aren’t inherently more secure than they’d be in the cloud, says Mark Anderson, founder of the INVNT/IP Global Consortium, a group of governments and security experts solving the growing cyber theft problem. Many companies are routinely hacked and don’t know it, says Anderson, a tech visionary and founder of Strategic News Service.
  • Security has grown into a full-time job, one requiring a team of experts, and the few experts available require hefty salaries. Microsoft hires the best and brightest when it comes to thwarting security breaches, and we have the scale most companies can only dream about.
  • To keep Office 365 security at the pinnacle of industry standards, our dedicated security team uses processes such as the Security Development Lifecycle and traffic throttling, and invests in preventing, detecting, and mitigating breaches at a level many companies don’t have the resources to match. And Microsoft Office 365 has a 99.9 percent financially backed uptime guarantee.
  • Additionally, we staff industry-leading regulatory compliance experts. We know and keep up to date with the latest regulations and rules: HIPAA and Sarbanes-Oxley, Federal Information Security Management Act (FISMA), ISO 27001, European Union (EU) Model Clauses, U.S.–EU Safe Harbor framework, Family Educational Rights and Privacy Act (FERPA), and the Canadian Personal Information Protection and Electronic Documents Act (PIPEDA), just to name a few.

 

Myth 4: I have to move everything to the cloud; it is an all-or-nothing scenario.

FACT:
  • While early cloud supporters proclaimed the cloud as the Holy Grail, no one really advocated fork-lifting your entire enterprise to the cloud over the weekend. Most implementations start with a hybrid approach, moving a single application, like email, and growing from there.
  • The hybrid cloud creates a consistent platform that spans data centres and the cloud, simplifying IT and delivering apps and data to users on virtually any device, anywhere. It gives you control to deliver the computing power and capabilities that business demands, and to scale up or down as needed without wasting your onsite technology investments.
  • Many companies are moving productivity workloads to the cloud; the path for each is different, and the time it takes for those migrations varies. We can help you move workloads such as file sync and share (OneDrive for Business) or email (Exchange) first, and then help you figure out the right long-term plan for more difficult or larger products.

 

Myth 5: Cloud migration is too much for my business to handle.

FACT:
  • When you start considering how to move petabytes of data to the cloud, it’s easy to see why some people think ‘going cloud’ is too big a challenge. We’re not going to tell you it’s as easy as pie. But you can be up and running quickly for agile initiatives and calculated data migrations.
  • We’ll help you every step of the way with information and tips on firewall configurations, reverse proxy requirements, identity options, migration possibilities, and a phased approach to hybrid setups. We’ve created several paths you can follow, and in most cases you can use your existing tools and processes.

 

Myth 6: Corporate spies, cyber thieves, and governments will have access to my data if it is in the cloud.

FACT:
  • This is a top fear many businesses have about the cloud. But it’s unfounded. Your IT team manages access, sets up rights and restrictions, and provides smartphone access and options. Your company remains the sole owner: you retain the rights, title, and interest in the data stored in Office 365.
  • We operate under several key principles when it comes to safeguarding your data:
    • We do not mine your data for advertising or for any purpose other than providing you services that you have paid for.
    • If you ever choose to leave the service, you take your data with you.
    • Privacy controls allow you to configure who in your organisation has access and what they can access.
    • Extensive auditing and supervision prevent admins from unauthorised access to your data.
  • Strict controls and design elements prevent mingling of your data with that of other organisations. Our data centre staff does not have access to your data. Additionally, we offer 99.9% uptime via a financially backed service level agreement. If a customer experiences monthly uptime that is less than 99.9%, we compensate that customer through service credits.
  • Microsoft is the first major cloud provider to adopt the world’s first international standard for cloud privacy. The standard establishes a uniform, international approach to protecting privacy for personal data stored in the cloud. It reinforces that:
    • You are in control of your data.
    • You know what’s happening with your data.
    • We provide strong security protection for your data.
    • Your data won’t be used for advertising.
    • Microsoft encourages government inquiries to be made directly to you unless legally prohibited, and will challenge attempts to prohibit disclosure in court.

Myth 7: Skype and Skype for Business are one and the same.

FACT:
  • Skype that you use at home is great for a small number of users and is free to use, unless you want to buy credit to make calls to landlines and mobiles.
  • Skype for Business lets you add up to 250 people to online meetings, gives you enterprise-grade security, allows you to manage employee accounts, and is integrated into your Office apps.
  • Skype for Business integrates with Office 365, boosting productivity by letting people connect on their terms. Employees can make and receive calls, give presentations, and attend meetings from one application – from anywhere – as long as they have an internet connection. For example, employees can:
    • Instantly see when someone is busy or available.
    • Start an instant messaging session by double-clicking a contact name.
    • Share a desktop during a meeting.
    • Invite outside partners to join a meeting via a full-featured web conferencing experience.
    • Integrate video through a webcam for a call or conference.
  • With Skype for Business, you don’t need to have a dedicated administrator to run servers or invest in additional infrastructure. We take care of all of it for you. As a part of Office 365, Skype for Business offers users new features, upgrades, and patches as soon as they are ready. Skype for Business and the consumer version of Skype can also be federated so that communication is possible between platforms. Skype for Business service is supported around the clock. Of course, your IT team will have to manage settings, access, and security, but we handle the rest.

Myth 8: Email isn’t any simpler in the cloud.

FACT:
  • By moving your business email to the cloud, you can rest easy knowing that the experts who created the software are taking care of the tricky maintenance, while your team keeps control of your company’s capabilities and how your employees use features. You can spend more time on the core operations that build your business value rather than keeping up with persistent hardware maintenance.
  • Software updates and fixes are delivered automatically as soon as they are released, and Exchange Online is always first in line for updates. Although the management and updates are fully automated, you are still in control when you need to be, via the Exchange Admin Centre.

Myth 9: Continuously updating Office 365 will break my critical business applications.

FACT: We know that a lot rides on your employees being able to use business-critical apps and add-ins with Office. We are committed to compatibility with the tools you use every day with Office 365. We do that by:
  • Offering the same worldwide standard of desktop applications with the familiar tools you know and love, including Word, PowerPoint, and Excel.
  • Working hard to ensure that even as we update Office on a regular basis in the cloud, our updates do not affect the areas that other software applications rely on. For example, for the past 24 months, monthly releases of Office 365 have not resulted in object model or API changes. If your business-critical solutions work with Office 2010 or Office 2013 today, chances are they will work with Office 365.
  • Collaborating closely with leading software vendors, and providing them tools and early access to ensure that their solutions that work with Office continue to work with Office 365.
  • Helping you avoid compatibility issues with guidance and best practices for update management and development.
  • Enabling side-by-side installs of Office 365 ProPlus and your older versions of Office, which gives you the time needed to remediate any issues.

 

Source: Microsoft

Enterprise Mobility – Fulfil the Promise, Avoid the Pitfalls

We see the pattern time and again. “Everyone” agrees that a new technology will transform business and that you must be part of it or risk being left behind. Businesses caught up in the hype rush to implement optimistic and poorly thought-out projects. Something goes wrong, resulting in massive costs and reputational damage. Finally, we take a more cautious and realistic approach to building the new technology into our business models, and the technology starts to meet its early promise.

Such is enterprise mobility. The notebook, smartphone, and broadband wireless are enabling technologies that allow us to break away from the office, and they have accelerated a transformation in how we think of the workplace. Benefits from anywhere access to data and tools include a boost to productivity, improved customer service, and flexibility for employees. The concept appears to be a clear win/win, with evangelists spruiking the undeniable benefits but often ignoring the security implications. We are a long way down the road to mobile maturity, but we are not quite there yet.

Early mistakes were made, and records show it takes time for an industry to adapt and learn. In 2006, millions of health records in the US were exposed from a stolen laptop, resulting in a class action that cost tens of millions on top of the privacy and identity theft issues. Lesson learned? Perhaps not; try googling breaches from lost and unencrypted notebooks and smartphones and you will find the same mistake made time and again.

A variety of risks and mistakes continue to be documented. Just this month a Chinese firm admitted to installing hidden software that sends the user’s text messages, call logs, contact list, location history, and app data back to Chinese servers – software that may have been preinstalled on as many as 700 million phones! What happens when such a phone is brought inside your corporate network as a BYOD device?

So how to reduce these risks? Any solution must take into account the diverse range of devices, technologies, and levels of user awareness present across an organisation, as well as the trade-off between security and ease of access and use.

Attempting to implement a specific solution for each disparate device, scenario, and individual is prone to failure and akin to whack-a-mole. Instead, a multilayered approach can work, with a fundamental focus on data, authorisation, and compliance rather than on the device or specific risks. Apply broad strategies that cover unforeseen as well as known risks, making the system as intrinsically safe as practical. Build a consistent, secure environment across devices and applications, and quarantine and protect that environment from unregulated parts of the system.

The most successful solutions will allow a company to maintain control of its data while not getting in the way of work.

Elements of a Mobile Security Strategy

In order to develop a robust mobile security strategy, consider a wide range of technologies and techniques, then pull them together to meet your security objectives and implement a consistent strategy.

Manage the Human Factor

The greatest vulnerability in any corporate security system is its people. People want to get their job done, not fight with the tools and access they need to do that job. Where security gets in the way, they will work around it and introduce new risks.

Staff will use weak passwords that are easy to remember. They will click on random email attachments with no thought that they may be a virus. They will help the nice man, purportedly from Microsoft, remotely take over their PC to fix the “computer problems” he generously rang them about. They will enter their credentials into a fake website, just because. They will jailbreak their phone. They will let little Jonny install a game that comes with a special payload of malware. They will not do these things to harm their company, boss, or IT staff, but rather because their focus is on their work and because they don’t have the knowledge or awareness to know better.

People don’t like to feel needlessly constrained in what they can do with their tools, or even which tools they are allowed to use, and that is doubly so when they are using personal devices for work.  Security policies will be more effective if they take into account user expectations and behaviour.  Enforce password policies but perhaps also support alternative and easier authorisation methods, say fingerprint access.  To share files, the standard corporate fileserver may not cut it for staff used to using Dropbox or OneDrive, so perhaps look at cloud options that can be implemented in a secure way.  Solicit requests from staff about current pain points and any tools or functions they feel are missing and work out a way to help them out – with security integrated.

Work with staff to meet their needs rather than try to dictate from on high what staff must use.

Source: Microsoft Enterprise Mobility and Security Blog


Redefine “The Workplace”

In the world of enterprise mobility, the “Workplace” is now a collection of locations, devices, data, and communication channels. Not all of these elements are under the direct control of the company, and the edges of the corporate environment are necessarily blurred.

Defining a mobile security environment then necessitates a focus on defining and monitoring flows and storage of information and identifying where boundaries are set and how to control movement of data across those boundaries.

Set and Enforce Mobility Security Policies

To limit risks of unauthorised access, a strict mobile security policy is essential.

The basics include enforcing a lock policy on devices, and device encryption. You can also set compliance requirements for devices, such as ensuring patches and antivirus are up to date, and checking that a device is not jailbroken and does not have risky software installed on it.
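
The checks above map naturally onto a simple evaluation function. The sketch below is illustrative only: the Device fields, the 30-day patch threshold, and the RISKY_APPS list are hypothetical stand-ins for whatever attributes and policies your MDM/EMM platform actually exposes.

```python
# Hypothetical sketch of a device compliance check.
from dataclasses import dataclass, field
from datetime import date, timedelta

@dataclass
class Device:
    name: str
    screen_lock_enabled: bool
    storage_encrypted: bool
    last_patch_date: date
    antivirus_up_to_date: bool
    jailbroken: bool
    installed_apps: set = field(default_factory=set)

RISKY_APPS = {"unknown-sideloaded-store"}   # illustrative blacklist only
MAX_PATCH_AGE = timedelta(days=30)          # illustrative policy threshold

def compliance_issues(device: Device) -> list:
    """Return a list of policy violations; an empty list means compliant."""
    issues = []
    if not device.screen_lock_enabled:
        issues.append("no screen lock")
    if not device.storage_encrypted:
        issues.append("storage not encrypted")
    if date.today() - device.last_patch_date > MAX_PATCH_AGE:
        issues.append("patches out of date")
    if not device.antivirus_up_to_date:
        issues.append("antivirus out of date")
    if device.jailbroken:
        issues.append("device is jailbroken")
    if device.installed_apps & RISKY_APPS:
        issues.append("risky software installed")
    return issues

laptop = Device("fred-laptop", True, True, date.today() - timedelta(days=60), True, False)
print(compliance_issues(laptop))   # ['patches out of date']
```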

To implement such policies you need some control over the device, and that can cause issues in the case of BYOD where policies may conflict with personal use of the device, or where enforcement of compliance may not be realistic on the device.


Application Control

Application control aims to reduce the risk posed by security flaws in particular applications. At a basic level, a whitelist or blacklist of approved applications and versions might be enforced alongside centralised provisioning and management. More advanced methods that have emerged in recent years include security and management protocols baked into applications. Again, in many cases where staff are using personal devices, enforcing application control can be a point of conflict.


Protect Data in Transit, Layer Security

Mobile devices may access corporate resources across a changing variety of network infrastructure including public and unsecured wireless hotspots.  Ensuring traffic that transits across such networks is secured by appropriate encryption protocols is essential.

Some small businesses allow remote users to log in to work machines directly with the Windows RDP protocol. Don’t. While RDP is generally secure, you only need one bug or weak password and you have a breach. Require a VPN to carry your RDP traffic (remember CVE-2012-0002, which allowed RDP servers exposed to the internet to be compromised; you don’t want that). A VPN may itself have bugs or other vulnerabilities, but two reasonably independent layers are much less likely to be penetrated than one.

BYOD vs CYOD

In some environments, Choose Your Own Device (CYOD) rather than Bring Your Own Device (BYOD) is a popular trade-off: policy allows staff to choose from a wide range of acceptable devices that are owned by the company, rather than allowing an open slather approach. This approach can reduce the range of potential vulnerabilities and will reduce conflict over acceptable use of the device by keeping hardware ownership within the company.

Protect Documents at the File Level

Rights Management technologies can be used to secure access to company documents by default, and to restrict movement of those documents outside of a secured environment. At a basic level that means encrypting all documents and only unlocking them after appropriate authentication. This means that if a document is accidentally emailed, or a device holding the documents is stolen, the document will still not be accessible. It also means that if authorisation is revoked for a user, they lose access to corporate information, even if that information is still on their personal devices.
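
To illustrate the principle (not any particular Rights Management product), the sketch below keeps a document encrypted at rest and only releases the plaintext to an authenticated, authorised caller; revoking authorisation removes access even though the encrypted blob may still sit on the device. It assumes the third-party Python cryptography package; commercial RMS/IRM services manage per-document keys and identities for you.

```python
# Minimal sketch of file-level protection (pip install cryptography).
from cryptography.fernet import Fernet

# In practice the key lives in a rights management service, never on the
# device next to the document.
document_key = Fernet.generate_key()

def protect(plaintext: bytes) -> bytes:
    # Store only the encrypted form of the document.
    return Fernet(document_key).encrypt(plaintext)

def open_document(ciphertext: bytes, user_is_authorised: bool) -> bytes:
    # The key is only applied for an authorised user; revoke authorisation
    # and the document on the device becomes unreadable.
    if not user_is_authorised:
        raise PermissionError("access to this document has been revoked")
    return Fernet(document_key).decrypt(ciphertext)

blob = protect(b"Quarterly results: internal only")
print(open_document(blob, user_is_authorised=True))
```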

Restrict Printing, Emailing, or Copy/Paste of Corporate Information

With documents encrypted, decryption can be restricted to a whitelist of approved applications, and the approved application can in turn restrict the ability to copy or print sensitive documents.

Encrypt Everything

Whole-device encryption is slowly becoming standard on smartphones (much to the highly publicised concern of some government authorities) and is a must to ensure data on devices cannot be read, even if an unauthorised person gains direct access to the device’s file storage.

Technology such as BitLocker has been available for some time and is underused on notebooks and desktops. Trusted Platform Modules (TPMs) are now quite common on business-focused laptops and allow for simple access with BitLocker enabled on a notebook.

File-level encryption may be more appropriate where personal devices are in use, and to better protect documents that may be transmitted to other users or to remote file servers or cloud storage. Using both technologies is reasonable and largely invisible to the user.

Use Multi-Factor Authentication

Typical authentication requires knowledge of or access to a single authentication key, such as a password or a physical device. The problem is that when that access method is discovered or becomes accessible to an unauthorised person, the attacker is straight in.

Two-factor authentication requires access to two different categories of authentication keys, selected such that if one authentication method is exposed, it remains unlikely that the second method is also exposed, so the attacker still cannot gain access. For example, an online portal might be secured with a password but also require access to a separate security fob that generates a changing one-time password. If the set password is exposed, an attacker still cannot log in without physical access to the security token. For highly sensitive information, additional authentication requirements might be added.

The main drawback of multi-factor authentication is the additional time and nuisance of entering two or more authentication keys every time data is accessed. This issue should be managed by considering the value of the protected content and applying realistic policies to find a reasonable balance. For example, when accessing data at an online portal from a particular device, policy may require the password on every access (or after a short timeout) but the changing security token only once per day, when access can be verified to come from a previously authorised device.
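
As an illustration of the “changing one-time password” generated by a security fob or authenticator app, the sketch below computes a time-based one-time password (TOTP, RFC 6238) using only the Python standard library. The shared secret is a made-up example; production systems should use a vetted library and store secrets securely.

```python
# Minimal TOTP sketch: server and token hold the same secret, so both can
# compute the current 6-digit code; a stolen password alone is not enough.
import base64, hashlib, hmac, struct, time

def totp(shared_secret_b32: str, period: int = 30, digits: int = 6) -> str:
    key = base64.b32decode(shared_secret_b32, casefold=True)
    counter = int(time.time()) // period                 # changes every 30 s
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                           # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)

print(totp("JBSWY3DPEHPK3PXP"))   # example secret, not a real one
```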

Push Notification for Microsoft Authenticator app on iOS


 

Device Access Control

Maintaining a registered list of approved devices (corporate- and personally-owned) allows access to be restricted to those devices, reducing the issues with an open slather approach.

Partitioning Personal and Corporate Data

When accessing corporate data and systems on personal devices, isolating corporate from personal data and usage can help maintain privacy for the user and secure corporate data from unsanctioned access or copying. Access to corporate data can then be restricted to approved applications, and a remote wipe of corporate data can be performed without touching personal data.

Use Data Analytics and Context – Conditional Access

Increasingly intelligent authorisation systems can be used to detect and block unusual activity, and can be tailored to the complementary systems that are in use.

Fred might log into a company cloud storage service in the evening for an hour or two, accessing it from his home internet connection on an IP address in Brisbane. He might access the same information the following day from a wireless hotspot while at lunch, also in Brisbane. An hour later, he tries to access the information from an IP address registered in Melbourne and a different device. That may raise a flag, and an advanced authorisation system might block that attempt and lock his account in case it is an attempt using leaked credentials.
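
A toy version of that rule is easy to express in code. The sketch below flags a sign-in when the distance from the previous sign-in could not plausibly be covered in the elapsed time (“impossible travel”); the coordinates, timestamps, and speed threshold are illustrative, and real conditional access engines weigh many more signals.

```python
# Flag sign-ins that imply implausible travel between locations.
import math
from datetime import datetime

def km_between(lat1, lon1, lat2, lon2):
    # Haversine great-circle distance in kilometres.
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 6371 * 2 * math.asin(math.sqrt(a))

def suspicious(prev, curr, max_speed_kmh=900):
    """prev/curr are (datetime, lat, lon) tuples for consecutive sign-ins."""
    hours = (curr[0] - prev[0]).total_seconds() / 3600
    distance = km_between(prev[1], prev[2], curr[1], curr[2])
    return hours <= 0 or distance / hours > max_speed_kmh

brisbane_lunch = (datetime(2016, 11, 1, 13, 0), -27.47, 153.03)
melbourne_hour_later = (datetime(2016, 11, 1, 14, 0), -37.81, 144.96)
print(suspicious(brisbane_lunch, melbourne_hour_later))   # True: raise a flag
```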


Use an Enterprise Mobility Solution

A range of enterprise mobility solutions are available from major IT corporates and are under rapid development.  A number of packages have reached a level of maturity and include many of the technologies discussed in this article along with excellent reporting tools and risk management systems.  They are worth considering as an excellent starting point and core component of your mobile strategy.

Enterprise mobility solutions can be assessed on features such as whether they:

  • support a wide range of devices, environments, and applications;
  • include threat detection based on known attacks and vulnerabilities as well as abnormal behaviour;
  • can wipe all corporate data from a device when an employee leaves the organisation;
  • can set policy restrictions, for example restricting the ability to cut and paste content to unprotected files;
  • prevent access from devices or environments that do not comply with security policies, such as jailbroken phones, and lock or remove data on devices that become non-compliant;
  • provide an end-user self-service portal so users can enrol their own devices;
  • include single sign-on so that, once authenticated, multiple applications and sites are accessible; and
  • support bulk deployment tools to enrol devices, change rules, and install applications at scale.

Bringing it all Together

Enterprise Mobile Security requires wide-ranging integration of technologies, procedures, and policies and is one of the toughest and most important systems to get right in your organisation.  It requires a good knowledge of your business but also of the technologies available.

My advice is to keep your eye on the big picture and continuously weigh up risk against productivity while reviewing the system’s effectiveness, and feed those reviews back into incremental improvements. The more traditional rigid approach of ticking boxes and believing you are secure is a sure path to failure.

For smaller organisations, draw on the experience of external experts, but don’t buy into a prepackaged, “standard” solution (there is no such thing). Work with consultants who take the time to understand your business, and tailor the technology and policies to your needs.

Further Reading

Cyber Security Report

Pre-installed Backdoor On 700 Million Android Phones Sending Users’ Data To China

Why stolen laptops still cause data breaches, and what’s being done to stop them

Microsoft EMS Blog

 

Protect your Business and Family with OpenDNS

OpenDNS offers a suite of Internet filtering and security features targeted at levels from the home environment to corporates. The service works by replacing the Internet’s standard address book system (DNS) with a custom service, which then allows you to limit, track, and manage access to the Internet and helps protect your staff and family from malware, phishing, and inappropriate content.

OpenDNS was bought by Cisco in late 2015 and is now being integrated with their extensive product and service portfolio. With Cisco’s backing, the service will likely continue to grow in the corporate market, and at this stage it appears that Cisco will continue to support the home and small business markets. Basic filtering and lookup services remain available at no cost, with more sophisticated services available for a fee.

The nature of the service means that security can be set up and managed at a single point where all new devices plugging into the network are automatically included, simplifying administration and reducing or removing the need to load custom software onto each device.

Many of our clients have migrated away from device specific software lockdown tools to OpenDNS in an effort to reduce the impact of device specific tools on performance, and to reduce administration costs.  The service is not perfect and is best used as part of a layered strategy, but it is a low impact, high value service that we recommend for most sites.

How can OpenDNS help your Family or Business?

The OpenDNS service can help you:

  • Protect families and employees from adult content on the internet by blocking those sites.
  • Improve productivity and conserve bandwidth by blocking time-wasting sites, or by blocking all sites except for a group selected by you.
  • Reduce the risk of malware infection by blocking sites that are known to contain malicious software and exploits.
  • Block “phishing” sites that try to trick you into giving up your identity and login information.
  • Improve the responsiveness of web browsing in cases where ISP DNS servers have poor response times.
  • Improve the confidentiality, integrity, and availability of the DNS service through improvements to standard DNS protocols, such as support for CurveDNS and DNSCrypt.
  • Gain visibility of your internet traffic with access to logs of sites visited by your users and sites that were blocked.

How it Works – Take control of the Domain Name Service (DNS)

When you enter the name of a web site like www.computeralliance.com.au into your browser, you are asking something like “Please show me the Computer Alliance web site”.  Easy you might think, but one little problem crops up: your browser and computer may have no idea where the Computer Alliance web site is located.  Oops.

The web “address” you type into a browser is more akin to a name than a street address. A comparable request to a friend might be something like “Please head down to Computer Alliance and check out their stock and pricing for me”. If your friend knows our physical address, all good; otherwise they have no idea how to reach us.

So what does the friend do?  Well, they ask!  You are the closest source who might know the address so they ask you first, and if you know, then problem solved, but if you don’t you might then look it up for them using a more comprehensive address book.

Internet addressing works in a similar way, using what we call the Domain Name System.  DNS is a protocol that lets computers look up domain names like www.computeralliance.com.au and find out their numerical internet address (something like 103.4.238.104), the one more like the street address that you must have in order to contact the site.

Similar to the idea of your friend finding the address by checking their own memory, and then asking you, DNS works on a hierarchy where your computer will check to see if it has recently looked up the address and already knows it, and if not will then ask its closest DNS service, probably your modem/router.  If that service doesn’t know either, it will know about a more comprehensive server further up the chain to ask.  Eventually the address will come back down the chain to be passed back to your browser.
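
You can watch this happen from a couple of lines of Python using only the standard library; the operating system’s resolver walks the same chain described above and hands back the numerical address.

```python
# Resolve a name to its numerical address via the OS resolver (cache,
# local DNS server, upstream servers) and print the result.
import socket

print(socket.gethostbyname("www.computeralliance.com.au"))
```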


The address of your local DNS server is normally assigned to your computers automatically and in most cases will point to a DNS service on your router or local server. That server will then pass queries on to your internet provider’s DNS servers.

With OpenDNS as your DNS provider, replacing your ISP’s DNS servers, you still receive the full range of usual address lookups, but you also take some control over the responses to manage access to those addresses. In this way you can make use of OpenDNS systems that identify and categorise sites by content and by security risk, and then automatically block sites that you don’t want accessed. You can also select specific sites to block, or block everything and only allow specific sites.
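
If you want to see the difference a resolver makes, the sketch below sends queries directly to OpenDNS’s public resolvers (208.67.222.222 and 208.67.220.220) rather than whatever your system is configured with. It assumes the third-party dnspython package; for a domain your policy blocks, OpenDNS typically answers with the address of its block page rather than the site’s real address, which is how filtering works without any software on the client.

```python
# Query OpenDNS resolvers directly (pip install dnspython).
import dns.resolver

resolver = dns.resolver.Resolver(configure=False)   # ignore system DNS settings
resolver.nameservers = ["208.67.222.222", "208.67.220.220"]

for record in resolver.resolve("www.computeralliance.com.au", "A"):
    print(record)
```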

How to Set up – Manually Set DNS

You can make changes to which DNS server your devices use when they don’t already know an address.  One method is to change the entries for DNS servers on all individual client devices, such as PCs, but it is normally easier and more secure to only change the local DNS server on your network that all other devices rely on for their initial DNS query.

Your router may use DHCP to automatically give out local addresses, and its own address as a DNS server, to all devices attached to the network (that’s the standard setup for most environments). Then all you need to do is change the router’s external DNS settings so that it in turn looks to the OpenDNS servers for the queries it passes back to your devices.

Set DNS on Router

Limitations – Bypassing OpenDNS

You may see an evident flaw in this method. If a user manually changes their computer’s DNS server settings to some alternative (for example, Google’s DNS servers) rather than the router, they may be able to bypass filtering. This can be a problem and is one reason OpenDNS should be used as one element in a multilayered security strategy.

In environments that are heavily locked down, using best practice security measures, users (and any malware that takes control of a user account) will not be able to change local DNS settings and attempts to query other servers may be blocked.  Even in those environments there may be possibilities to bypass OpenDNS and a range of other systems may be used to detect or block those bypasses.

The potential for bypassing OpenDNS depends on the intent of users, the knowledge of users, and the security setup of the environment. In most environments the risk of inadvertent or intentional bypassing is minimal.

Setting up a Blocking Policy

The level of control available varies with the level of service chosen, but all levels allow you some broad blocking options that are generally appropriate for most sites. You can select a typical collection of settings with a single button, or take more control and manually set which categories to allow or disallow, and even go to the detail of blocking specific sites.


Various reports on Internet activity are available to help you keep an eye on sites that users are attempting to access.  This can sometimes show up possible problems on your network where a certain application may continually attempt to access a particular site (such as malware trying to talk back to its servers or to infect other computers).


Blocking Malicious Sites

OpenDNS receives reports of malicious activity related to web sites and adds those sites to its blocking lists.  A default security policy is applied in addition to the general blocking policy specifically to deal with these threats.


If a user attempts to go to one of those sites, or, as is often the case, if a computer is directed there by malware, then the request will be blocked, protecting the user from malware that may be present on the site.

In cases where OpenDNS may get it wrong or the user may feel they have good reason to access a site, the interface allows users to submit a request to the site administrator to review the site and approve access.


Attempts to access that site are also logged, so you can monitor whether there is an ongoing malware infection on one of your computers.


OpenDNS Active Directory Integration

Most business sites use Active Directory to manage users, devices, and their access permissions.  If your business has a server based environment, you probably use Active Directory.

By integrating OpenDNS with your Active Directory environment you can set up group- and user-based settings for OpenDNS. That might mean, for example, that senior managers have less restrictive access to various web sites, while staff with less need to access the Internet are heavily locked down.

AD Integration requires two components:

1. Virtual Appliance

  • Runs in a virtualised server environment (Hyper-V, VMware).
  • Forwards local DNS queries to your existing DNS servers.
  • Forwards external DNS queries, with non-sensitive metadata, to the OpenDNS service.

2. Connector

  • Runs in your Active Directory environment.
  • Securely communicates non-sensitive user and computer login info to the Virtual Appliances.
  • Securely communicates non-sensitive user and computer group info to the OpenDNS service.

The Process of Setting up OpenDNS

Given that the basic OpenDNS service is free and straightforward to set up at the router level, I usually suggest simply signing up and giving it a go. If it causes issues for you, it’s quick to revert to your prior DNS settings.

Once you get a feel for its possibilities, it’s time to look at the more advanced options and see if a paid plan is worthwhile. To take it further in a business environment you should also review your site security (usually a good idea anyway) and consider how OpenDNS is best integrated.

Much of this work can be done by tech-savvy senior staff or home users; for advanced setup options you might want to consult with our technical staff at ABT (Alliance Business Technologies).

Further Reading

OpenDNS

Active Directory Integration

UMBRELLA BY OPENDNS!

DNSCurve Protocol

Guide to Personal and Small Business Backups – Storage and Tools

This article will examine options for backup storage and tools, provide advice on how to choose between them, explain how they can be effectively employed, and give examples of common implementation pitfalls.

Prior articles have worked through the high level conceptual framework and technical concepts that relate to backup systems.

Related Articles:

01 GUIDE TO PERSONAL AND SMALL BUSINESS BACKUPS – CONCEPTUAL FRAMEWORK

02 GUIDE TO PERSONAL AND SMALL BUSINESS BACKUPS – TECHNICAL CONCEPTS

Backup Storage Infrastructure

Your backup system will copy all your important electronic Stuff from one or more storage locations to some other storage location.  It is all about storage, so it naturally follows that the choice of the storage used for backups has a major impact on the effectiveness of your system.

To help select the type of storage that best suits, you might review the desirable attributes of a backup system that I outlined in our first backup article and consider how selecting storage types will influence attributes of the final backup system.  As a reminder, the attributes were: simple, visible, automated, independent, timely, cost effective, and secure.

Hard Disk Drive (HDD) Storage

HDDs have been the mainstay of IT storage for decades. The technology is slowly being replaced with SSDs, but HDDs are still the most common primary storage and one of the best options for backups.

HDDs allow random access, are fast, reliable, cheap, and generally have much going for them.

Internal Hard Disk Drives

Internal HDDs refer to the HDDs built into PCs and devices.  There are a number of options to incorporate internal drives into your backup system, though for the most part they play an incomplete role.

Where your PC has a single internal HDD, it will be of limited use as a backup drive.  In general, you don’t want to set a backup to the same physical device as the source files as it fails the test of independence – if the drive dies you lose source and backup files at the same time.

There are minor exceptions to this rule. You could maintain copies of older and deleted files on the drive to offer limited protection against accidentally deleting or overwriting files. You could also direct an image backup to the same drive in circumstances where external storage may not always be available but you want frequent, regular automated snapshots (remember to exclude the image location from the backup, or recursion will run you out of space!). In that case you would move the files to external storage when availability allows.
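
The same recursion trap applies if you script a simple file-copy backup yourself: when the destination folder sits on the drive being copied, exclude it or each run will try to copy its own output. A minimal sketch, with illustrative paths:

```python
# Copy a folder to a backup location on the same drive, skipping the
# backup destination so the copy never includes its own output.
import shutil
from pathlib import Path

SOURCE = Path("C:/Data")
DESTINATION = Path("C:/Data/Backup")          # lives on the same drive

def skip_destination(directory, names):
    # shutil.copytree calls this for each directory visited; return the
    # entries to ignore (here, the destination folder itself).
    return [name for name in names
            if (Path(directory) / name) == DESTINATION]

shutil.copytree(SOURCE, DESTINATION, ignore=skip_destination,
                dirs_exist_ok=True)
```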

Some PCs have more than one internal storage device. For example, you might use one fast HDD or an SSD for Windows and program files (for the speed), and a cheap, perhaps slower, mechanical HDD with large capacity for other files or as a dedicated backup drive (for the low-cost space). A second drive adds options and, given the minimal cost, I suggest adding a second internal HDD to a PC specifically for use in backups.

With a second internal HDD you could create a system image backup of the primary HDD to the second HDD; if the primary HDD fails, your backup will (hopefully) still be available on the second drive. That design lets you schedule and run the image backup with certainty that the destination will be available (reliable automation), and provides some independence between source files and the backup, but it is still not great.

I have seen clients use this technique alone as their backup system only to lose their data when a power surge, virus, theft, or other event destroys the data on both drives.  The design is vulnerable to these significant risks because it still largely fails our test for independence, where the backup destination should be as far removed from the source files as possible.

To improve the independence of the destination drive, you might use an internal HDD in a computer on the same network rather than in the same machine by setting up a network share.  That’s a little better in terms of independence, but again it is not ideal as certain events can still destroy the data on both drives.

RAID stands for redundant array of independent disks and is another way to use multiple internal drives to reduce your risk of data loss. One of the simplest types of RAID is called a mirror. A mirror uses two drives in an array, with the system automatically mirroring any writes onto both disks in real time. In terms of operation, it looks like you are working with a single disk, but any time you save a file, it will be available on both disks. If one disk fails, you won’t lose any data; in fact the PC will keep working as normal.

There are many other types of RAID, some allowing for protection against one or more drives failing, but also some where if any drive fails, all your data will be lost. You can set up a RAID array on a PC, but I rarely suggest that as a good option as its cost/benefit tends to be marginal against other options. The technology is most commonly used for server systems and storage arrays built on hardware best suited to supporting RAID arrays, and in those environments I consider RAID to be essential.

Never confuse RAID with a complete backup solution; I have come across some spruikers who convince people that RAID is some magic technology that fully protects your data. It is not.

The most common way to achieve a high level of independence for a backup system in a home or SME environment is to use multiple external HDDs.

External Hard Disk Drives

External HDDs are the bread and butter of home and SME backups.  They are awesome, and you should buy some!

There are two basic types.  The physically larger drives, often called desktop HDDs, are 3.5” in size and need a separate power supply (until USB-C drives become common).  The physically smaller drives, often called portable HDDs, are 2.5” in size and can be powered from a USB port.  For value for money and large capacity the 3.5” drives are better, while the smaller drives are easier to cart around.  Either is fine for backups.

External drives come with various connectors.  The most common is USB.  Be aware that USB 2 drives are limited to about 35MB/s transfer rates by the USB 2 standard itself.  Practically all current drives are USB 3.1, which allows for faster transfers limited only by the physical speed of the disk.  You will typically get 100+MB/s with USB 3 drives, so backups take much less time.  In terms of our preferred characteristics, “timely” means go with USB 3, though using an old USB 2 drive is fine as long as backups can still finish in a timely fashion.

You can get away with backing up to a single external drive, but your risk of losing data will be much higher than with two or more drives.  If you leave a single external HDD attached so it can take backups at any time, it may as well be an internal drive with the same vulnerabilities.  A single drive that you plug in only when backing up means you need to plug it in manually every time you want to back up.  If you get lazy and leave the single drive plugged in, you will find a virus, power spike etc will kill your backup and source files at the same time, and you are stuffed.  If you don’t get around to the hassle of plugging it in for a long time, then when your PC’s HDD dies, all your recent files are lost.

Allocate at least two external drives to your backup system, and preferably three or more.  One can stay attached so scheduled backups work without you thinking about it, and every now and then you should swap the attached drive with one stored elsewhere.  If you can afford a third or more drives, don’t swap them in a strict sequential cycle; keep one drive that is swapped in much less often, so it retains some backups for a longer period and reduces the risk that damaged files are overwritten across all your backups.

USB Pen Drives

Small, light, reliable, and increasingly large and fast, USB pen drives can be used as an alternative to external HDDs for backups.  At the time of writing they tend to be slower and smaller at a given price compared to a HDD, but where your backup needs are modest, a pen drive may do the job nicely.  Use them the same way you would an external HDD.

Solid State Disk Drives (SSD)

SSDs are slowly replacing mechanical HDDs in computing devices for their speed and (potentially) reliability advantages.  At the time of writing they are still expensive for bulk storage and not generally recommended for backup solutions.  There are rare exceptions where their raw speed shortens the time needed to run a backup enough to make them worthwhile, but for home and SME users, don’t buy SSDs for backups unless you have some special reason.

Network Attached Storage (NAS)

A NAS box is essentially a mini PC dedicated to file storage.  Most run a Linux OS with a web-based GUI for setup and management, and extra features in the form of “apps”.  Their file storage can be accessed across your network, and even from outside your network.

You will normally buy a NAS without HDDs, and then populate the unit with the size and brand of drives you need.  It is important to match the unit with drives listed on the manufacturer’s compatibility list to ensure no glitches with the operation of the unit.  Drive manufacturers now make HDDs specifically for NAS units, like the WD Red range, and drives designed for NAS devices are normally a better choice than cheaper options.

RAID is a standard feature of NAS units, where all files are stored on at least two physical disks.  With this protection, if a drive fails you won’t lose any data.  Remember that when you add HDDs to a NAS with RAID redundancy enabled, you lose some capacity to allow the data to be replicated.

For home use, you might store some less important, bulky files on the NAS given you have some protection from the RAID alone (eg movies for media streaming), and additionally use the NAS as your primary backup device for image backups and/or file mirrors of your critical data.

If you buy a two bay NAS and add 2 x 4TB HDDs, you will only have a total of about 4TB available (a mirror); with three 4TB drives you would have about 8TB, and similarly with 4 x 4TB about 12TB.  Also remember that drive manufacturers use a generous way to calculate capacity, so the NAS will report a little lower capacity than you might expect.

Some brands, such as Netgear, allow you to add drives as you need them and have the available capacity automatically increased without the need to wipe and recreate the array.  You can start with a four bay unit with two HDDs, then add a third and fourth as needed.

NAS units are attached to your network with a standard network cable and can be located in another room, or building, from your main devices.  They can be powered up and down remotely using wake on LAN commands.  They are excellent for automated backups and can act as a central backup location for all your devices.  For NAS units containing critical data, adding a small UPS and/or a surge protector is a good idea.

The main drawback of using a NAS as your only backup is that, while it is not physically attached to your devices, it is still prone to some of the events that could destroy its data and that of the originating device at the same time.  Power surges, theft, and some viruses are common risks.  One way around that issue is to rotate external HDDs attached to the NAS to take data from the NAS offsite.  You can also reduce the risks with techniques such as a dedicated set of network credentials, so a virus with access to your other PCs cannot simply reach the device.
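To illustrate the credentials idea, here is a minimal sketch of a batch file that connects to a NAS share with a dedicated backup account only for the duration of the copy, then disconnects.  The share name, account, and paths are placeholders for your own setup, and the copy itself uses Robocopy, which is covered later in this article.

    @echo off
    rem Connect to the NAS share with a dedicated backup account; the * prompts for
    rem the password so it is never stored in the script.
    net use \\NAS\Backups * /user:backupuser
    rem Mirror the source folder to the share: /e copies subfolders, /purge removes
    rem files from the backup that were deleted from the source.
    robocopy "D:\Data" "\\NAS\Backups\Data" /e /purge /r:2 /w:5
    rem Drop the connection so the share is not left open to the rest of Windows.
    net use \\NAS\Backups /delete

Because the connection only exists while the script runs, malware using your normal Windows login has a much smaller window in which to reach the backup share.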

There are various limits and risks to using a NAS in your backup system, but a NAS can be a useful element in almost any backup system and I recommend them for most designs.


Cloud Storage

Let us all pause for a moment, and be thankful that our government vastly accelerated the rollout of massive bandwidth services by building an awesome NBN so we now lead the world in connectivity.  We can now easily work from home, backup everything into the cloud with a click, and offer our professional skills to a world market.

Oh, wait, sorry, delusion setting in again.  Happens when you spend too much time in this industry.  This is, after all, the Australian Government.  Let’s instead spend billions on roads so we can allow more people to move from A to B while producing nothing except pollution.  That’s productivity for you, Australia style.

Back to reality.  Cloud Storage refers to storage capacity you can access through the internet, normally third party storage but sometimes your own.  It’s a big deal nowadays as industry behemoths fight to get you on their cloud.  In theory, it’s a great way to back up your stuff.  Unfortunately, there is a big gotcha, the bottleneck that is your internet access.

You most likely have a low speed ADSL connection with upload speeds of under 100KB/s (uploads are much slower than downloads with ADSL).  That means it takes at least a few hours to upload a single gigabyte of data, while clogging up your internet connection so it’s barely usable for anything else.  Cloud backups are viable with slow connections, but limited and must be managed carefully.

So what is a cloud backup?  Nothing fancy, it just means that instead of using local storage, like an external HDD, you use Cloud Storage to save your stuff.  It’s a great idea because the instant you have completed the backup, those files are offsite and, depending on the service used, protected across multiple sites managed by professionals who are probably less likely to lose your stuff than you are!

If you are one of the lucky people who enjoy a 100Mb/s or more upload service, great, then you are probably able to backup everything to the cloud.  For the rest of us with a low bandwidth internet connection, cloud backups are best used in a targeted way.  In other words, back up your small and important files rather than everything and use more traditional means alongside cloud backups.

“The Cloud” is a relatively new phenomenon and service providers are still working out viable business models.  New services appear and disappear on a monthly basis.  For the most part, I suggest looking at services provided by the big guys such as Microsoft, Amazon, EMC, Google, and similar.  I expect most of the small players to be absorbed or disappear.

All we need to send backups to the cloud is available capacity.  It is not essential to sign up to a service that is specifically targeted at backups (though there are advantages with some designs).  The most common service available, and one you may already have access to without realising, is OneDrive, Microsoft’s cloud storage service.  If you have an Office 365 subscription, you will have access to a practically unlimited storage capacity on Microsoft’s servers that you can use to move files around, share stuff, and back up stuff.  OneDrive is not designed as a backup solution, but it can be used as part of a backup system: it sets up a folder on your PC and any files saved there are automatically uploaded to your cloud storage.  Great for documents, not so viable for large files such as video or image snapshots.
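As a simple illustration of that targeted approach, here is a minimal sketch of a batch file that copies an important documents folder into the local OneDrive folder so the OneDrive client uploads it.  It assumes the default OneDrive location under %USERPROFILE%\OneDrive and uses Robocopy (covered later in this article); the source path is just an example.

    @echo off
    rem Copy important documents into the local OneDrive folder; the OneDrive client
    rem then uploads whatever lands there.  Paths are examples - adjust to suit.
    set SRC=D:\Work\Documents
    set DEST=%USERPROFILE%\OneDrive\Backups\Documents
    rem /e copies subfolders (including empty ones); /xo skips files that are not
    rem newer than the copy already in the destination, keeping upload volume down.
    robocopy "%SRC%" "%DEST%" /e /xo /r:2 /w:5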

Cloud storage services developed specifically for backups are also available and are more appropriate in a business environment.  Some, like Mozy (EMC), have been around for a while, and more recently the other majors have been moving aggressively into this market, with Azure (Microsoft) and AWS (Amazon) offering various solutions.

Cloud backup probably should form part of your backup system, and in some cases can form the core of your design.

Other Storage Options

Tape drives were, for many years, the go-to backup option for business.  Tapes were cheap and relatively reliable but needed to be written to in a linear way.  I won’t go into the details of tape drives, other than to simply say: don’t use them.  On a small scale, tape drives are messy and unreliable compared to other options.

SAN arrays are like NAS units but further up the food chain.  For medium and larger businesses, a SAN in your backup system makes sense, often including replication to offsite SANs at a datacentre or a site dedicated to disaster recovery.  If you need this sort of system, you probably have your own IT people who can set it up and manage it, and it is a bit beyond the scope of this article.

Others?  Yes, there are even more options, but I think that about covers the most common ones.

Backup and Archiving Longevity

I once found a decade-old stack of floppy disks, my primary backup store during my Uni days.  I went through all of them to make copies of old documents and photos and was surprised to find almost half of them still had retrievable data.  At that age I expected them all to be dead (Verbatim, quality FDDs!).  There was nothing critical on them, but it’s an interesting lesson: you can’t afford to set and forget any data.

Remember when writable CDs emerged?  The media were reporting how this awesome optical technology would allow data to be archived for at least 100 years.  Only a few years later we had clients bringing disks in to us after failing to retrieve “archived” data, with the disks physically falling apart.

Will your data be there when you need it?  The failure rates of modern storage hardware are low, but physical stuff never lasts forever and a realistic lifespan can be difficult to predict.  It is likely that the external HDD that has been sitting in your cupboard for the last five years will power up when plugged in, but the longer you leave it, the greater the chance that the device or the data on it will be gone.

Keep any data you may need on newish devices, and replicated across multiple devices.  When that old external HDD is just too small to fit all your backups, perhaps keep it with that old set of data on it and chuck it in a cupboard, but copy at least the critical files to a new, larger device as well.  Cloud based storage may be an option for long term storage, but trusting others to look after your stuff also introduces risk, so make sure you manage that risk.  Hint: free is bad, and companies (especially start-ups), along with the data they hold, can disappear with little notice.

If you produce too much data to cost effectively keep it all on new devices, give careful thought to how best to store “archived” data and weigh the risk of data loss against the cost of storage.

Backup (Software) Tools

There are a large number of software tools that you can use to build a backup system.  Do not fall into the trap of assuming that throwing money at a product will lead to a desirable result, though at the same time don’t rule out a high cost commercial option where it’s a good fit.

Google is your friend.  Look around online and check what the professionals use.  Making use of unpopular, emerging, or niche products is sometimes OK, but only adopt such tools where you see a substantive advantage in your environment.  In general, go with what everyone else uses to get a particular job done.  This will reduce your risk.

Consider the attributes of a backup system that I outlined in our first backup article and relate them to outcomes possible with the various tools: simple, visible, automated, independent, timely, cost effective, and secure.

Block Level (Image) Backup Tools

A block level backup tool is able to copy all data on a storage device, including open and system files, so you can be sure to capture everything stored on a partition or disk.

Windows has a basic imaging tool built in, though I’m not a fan of its interface and limited features.  There are some better free tools available, such as AOMEI Backupper, and a wide range of paid tools such as Acronis and ShadowProtect.  The free tools such as Backupper are adequate in many situations, though their features tend to be more limited and you may need supplementary tools to handle related functions such as retention and cleanup.
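For what it’s worth, the built-in Windows tool can also be driven from the command line with wbadmin.  A minimal sketch, run from an elevated command prompt and assuming E: is an attached external or second internal drive:

    rem Create a system image of C: on drive E:.  -allCritical includes everything
    rem needed to restore a bootable system; -quiet suppresses the confirmation prompt.
    wbadmin start backup -backupTarget:E: -include:C: -allCritical -quiet

Wrapped in a batch file and scheduled, that gives you basic automated imaging without spending a cent, though you will soon miss the retention and exclusion features of the dedicated tools.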

With any block level tool you intend to use, look for features including:

  • Support for full and incremental backups (and differential if you need it, but you probably don’t)
  • Automated, scheduled backups.
  • Options to encrypt and/or compress backups.
  • A process to verify the condition of the backup archive (test whether files are damaged)
  • Fast mounting of image files.
  • Replication (copy images to additional locations)
  • Retention (automatically clean up older backups to manage space on an age and/or size basis)
  • Ability to exclude specific files or folders. This is very handy, and not offered by all image tools, so pay particular attention to this one.
  • Bare metal restore to different hardware.
  • Support for “continuous” backups and related consolidation and retention (an advanced feature where frequent incrementals are merged into the archive and older files stripped out to manage space – excellent when uploading images offsite via the Internet)
  • Deduplication (useful for larger sites – eg if you back up a dozen Windows desktops, only one copy of each common system file is stored instead of 12, saving a lot of space)
  • Central management (manage backups across multiple devices from a single interface. Important for large sites)
  • Ability to mount and boot a backup image in a VM.

You probably don’t need all of these features, and some can be implemented outside the program.  For example, you could use Robocopy and the Windows Task Scheduler for replication.  Don’t just tick off features; go with a product that does what you need, reliably.

There are many implementation tricks that may not be obvious.  A common issue: if you create an image directly on an attached HDD and then swap the drive, you will at best end up with a different set of backups on each drive, which may be acceptable but is not ideal.  It is often better to create the archive in a location that is always available, such as a NAS or internal HDD, then replicate the entire set to the external drives (see the sketch below).  Think through what you need to achieve and make sure the tool you select can support those outcomes.
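Here is a minimal sketch of that “create locally, replicate out” pattern using Robocopy, covering the replication step your backup program may not do for you.  The drive letters, folders, and task name are placeholders.

    @echo off
    rem replicate.bat - mirror a locally held image archive to whichever rotation
    rem drive is currently attached (assumed to appear as F:).
    if exist F:\ (
        rem /mir mirrors the folder: copies new or changed files and removes ones
        rem that no longer exist in the archive.
        robocopy "D:\ImageArchive" "F:\ImageArchive" /mir /r:2 /w:5 /log:"D:\Logs\replicate.log"
    ) else (
        echo No rotation drive found at F: - nothing replicated. >> "D:\Logs\replicate.log"
    )

You could then schedule it nightly with a command such as schtasks /Create /TN "ReplicateImages" /TR "D:\Scripts\replicate.bat" /SC DAILY /ST 22:00 (again, the names are placeholders).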

A block level backup tool should be used in nearly all backup systems.

File Level Backup Tools

A file level backup tool can be any software that lets you copy files.  The Windows File Explorer could be considered a file backup tool.  To be more useful, however, we need to look at additional features such as:

  • automation,
  • compression,
  • encryption,
  • incremental backups,
  • retention,
  • and others depending on your needs.

File level backups can be very simple, quite transparent, and very reliable.  This type of backup process is excellent for backing up discrete files where you are not concerned about locked files or keeping old versions and deleted files.  It can also be used as a replication tool to copy image backups.

My favourite file level backup tool is one you probably already have: a program called Robocopy that is built into Windows and run from the command line.  It’s quite a powerful utility that can be automated with a batch file and the Task Scheduler.  If you are not comfortable with the command line or creating *.bat files, a better option may be one of the many free graphical utilities, or a GUI shell for Robocopy.  Rather than list the many options, I suggest using Google to find recommendations from reputable sources (try searching for “Robocopy GUI”).  There are many other similar tools; FastCopy is another we occasionally find useful.

File level tools may be adequate for a very basic backup system where you don’t care about backing up Windows, applications, or locked files, but for the most part they should be used alongside block level image backup tools.

Batch Files and the Task Scheduler

A batch file is a simple text file you can create in Notepad and save with the .bat extension in place of .txt.  If you double-click the file, Windows will read it one line at a time and try to run the command on each line in order.

A batch file can be used to automate file level backups or replication when set to run on a schedule with the Windows Task Scheduler.  For example, a batch file containing a line something like robocopy d:\MyFiles f:\MyFiles /e /purge will mirror your files to a different drive.
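To make that concrete, here is a minimal sketch of such a batch file with logging and retry switches added; d:\MyFiles and f:\MyFiles are just example paths.

    @echo off
    rem backup-files.bat - mirror a folder to a backup drive with Robocopy.
    rem /e copies all subfolders, /purge removes files from the backup that you have
    rem deleted from the source, /r and /w limit retries on locked files, /np drops
    rem the progress counter, and /log keeps a record you can check later.
    robocopy "d:\MyFiles" "f:\MyFiles" /e /purge /r:2 /w:5 /np /log:"d:\MyFiles-backup.log"

Point the Task Scheduler at the .bat file (a daily trigger is a good start) and you have a basic automated file mirror.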

If you get a bit more creative you can use the technique for many useful functions, including backup systems that retain older and deleted files, and to manage the retention of image backups (see the sketch below).  You can also look at PowerShell and other scripting options to implement more advanced backup designs.
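As one example of retention, here is a minimal sketch that deletes image backup files older than 60 days using the built-in forfiles command.  The path and age are assumptions; match them to wherever your imaging tool writes its archives, and test before letting it delete anything.

    @echo off
    rem prune-images.bat - delete backup archive files older than 60 days.
    rem To test first, swap the del command for "cmd /c echo @path" so you can see
    rem exactly what would be removed.
    forfiles /p "D:\ImageArchive" /m *.* /d -60 /c "cmd /c del @path"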

Emergency Measures

Designing a backup system is all well and good, but if it’s too late or your backup system has failed, is there anything you can do?

Sometimes.

Deleted files on a mechanical hard disk can often be retrieved by file recovery tools such as Recuva.  On an SSD you may be out of luck, as modern SSDs actively scrub old files shortly after they are deleted.

Copies of files may also be located in places you would not expect, such as cached files and online services.

A failed mechanical HDD will usually contain data that can be retrieved.  Data recovery experts may be able to help, however costs are often in the $1000s.

If you appear to have lost important files, leave the device powered down and ring us.

Bringing it all Together

This third part of our Guide to Personal and Small Business Backups outlined the storage and tools commonly used by backup systems.  Prior articles covered the high level conceptual framework around which you can build an efficacious backup system, and many of the technical concepts you need to develop and assess an appropriate backup design.

Our final article in this series will get to the nitty gritty by presenting and explaining solutions in detail as they relate to common home and small business environments.



USB goes Full Circle with USB-C

Universal Serial Bus (USB) was developed in the late 90s to replace a mess of slow PC connections with a high speed, one-size-fits-all plug and data transfer protocol.  All newish devices had the plug, it was good, and there were no real decisions or gotchas to look out for when buying new devices.

A decade or two later, things are again a mess.  Incremental changes to USB have offered progressive technical improvements, but at the cost of modified plugs and sometimes questionable backwards compatibility.  Mobile devices using the small variants of USB ports, or worse still proprietary plugs, have diversified the cables in use and ensured X won’t plug into Y.  New connection standards such as DVI, FireWire, HDMI, and DisplayPort emerged to meet specific needs for what, at a basic level, is a demand for local bandwidth that could in theory be carried by one cable.

An effort is again underway to resist the evil forces of cable proliferation and focus on the holy grail, one cable to carry them all, resulting in USB-C (also called USB Type-C).  This new connector is intended to replace the mishmash of USB plugs, and potentially other existing data plugs, to become the standard connector for attaching any peripheral to a computer or mobile device, and to provide enough electrical power to run many of them.

USB-C is now available.

Physical Characteristics of USB-C

USB-C on the left compared to Type A on the right

The original USB connector was a spade shaped plug on your PC end (Type-A plug), and a squarish plug on the peripheral end (Type-B Plug).  Smaller variants are available for mobile devices.

They would only plug in one way round.  When taking random stabs at plugging a USB cable into the back of a PC, under zero visibility, and while fighting off the tech eating spiders and poisonous dust clouds, it would be impossible to get the cable around the right way.

Problem solved.  The new USB Type-C is a small, reversible connector about the same size as a micro-USB plug but a little thicker, and pretty easy to get the cable into.

The USB-C plug was developed by the same industry group who define and control the USB data transfer protocols, but it should not be confused with those protocols.  USB-C can carry USB 3.1 and other USB standards but is not limited to those standards.

USB Protocols and how USB-C fits in

The last major USB standard release was USB 3.1 in 2013, a relatively minor upgrade from USB 3.0, which itself dates back to 2008.  USB 3.1’s main claim to fame was doubling theoretical transfer speeds from 5Gb/s to 10Gb/s.

For reasons that I fail to understand, the body that handles USB recently decided to make a naming change:

  • USB 3.0 changed to USB 3.1 Gen 1 (5Gb/s) also known as SuperSpeed USB
  • USB 3.1 changed to USB 3.1 Gen 2 (10Gb/s) also known as SuperSpeed USB 10Gbps

For comparison’s sake, the older and still very common USB standard, USB 2.0, supports 480Mb/s of raw data.

The new USB-C plug is able to carry data complying with either USB 3.1 protocol or USB 2 (with adapters), but is not limited to USB transfer protocols.  USB-C can handle other useful high bandwidth data streams including video and network traffic, and will be able to handle future high speed protocols. This is done through the use of alternate modes, where the system can hand over control of certain pins within the connector to carry traffic defined by protocols unrelated to USB.

Where faster Data Transfer Speeds Matter

USB transfer protocols indicate a theoretical maximum throughput in bits per second of raw data.  There are 8 bits to a byte, so for USB 2.0, 480Mbits/s = 60MBytes/s, but we then need to reduce this figure to take into account “overhead”, essentially the part of the data stream used to manage getting the information we need across.  To test this, try transferring a large file from a USB 2 HDD to your PC and look at the speed you get.  It will be no higher than 35MB/s, not 60, limited by the USB 2 standard less its significant overhead.


USB 2 HDD transfer speed is capped by the USB Protocol

Your mouse, keyboard, and many other devices are just fine with USB 2.0, but HDDs and the occasional other USB device will work better with the faster protocols.

Modern hard disk drives can transfer files at a much faster rate than 35MB/s, so the upgrade to USB 3.0 can significantly increase transfer speeds.


USB 3 HDD transfer speed is capped by the maximum speed of the physical disk

Marketing people confuse the issue by often quoting the USB 3.0 or 3.1 maximum speed on the box of many supported products, particularly HDDs.  This is not even close to the transfer rates you will get with the usual mechanical drives, as they will be limited by the transfer rate of the drive, not the transfer protocol.

Only the fastest modern SSDs are now surpassing USB 3.0 speeds, though given external drives are still mainly mechanical owing to cost, USB 3.0 is fine for now (or USB 3.1 Gen 1, if you prefer the new terminology!).

So what is the advantage of USB-C when transferring files using USB?  For the time taken to transfer a file, not much (well, until you go with an external SSD), but have you ever been annoyed by that bulky AC adapter powering your external HDD?  Well, USB-C can do something about that.

USB-C Power Delivery

The original USB specification allowed for up to 150mA of power at 5V, just 0.75W of power.  That was fine to power a mouse or keyboard but was never going to be adequate to power external HDDs or other more demanding devices.

USB 2 gave us a useful increase to 2.5W and then USB 3.0 to 4.5W, and a power charging option was introduced at 7.5W, though not simultaneously with data transfer.  Power was supplied only at 5V.  Small steps, but it allowed 2.5” HDDs to be powered off USB and mobile devices such as phones to be charged off USB ports, if over a long period of time.

With USB-C connectors we are finally seeing a major increase in overall power, and this time, at varying voltages.  The new plugs can support up to 5A at 5, 12, and 20V, potentially giving 20V x 5A = 100W of power.  Simultaneous data transfer is supported even at maximum power output.


This sort of power delivery will allow substantial devices to be powered from the same cable carrying data.  A desktop sized monitor needing just one cable from PC to screen is a good start; getting rid of the AC adapters on 3.5” external HDDs is another use.

The new power spec allows power to run in either direction.  Get home, plug your notebook into your desktop monitor by USB-C, and let your monitor charge the notebook.  Goodbye power brick.

The management of power between devices has also seen major improvement.  Where power is limited and needs to be shared across multiple devices, the protocol allows the power supplied to vary as demands change.  For example, if a laptop is pulling a lot of power off a desktop USB port to charge, and then the monitor, also powered off USB, is turned on, the power available to the laptop can be reduced to allow the monitor to run.  How well devices play together will be interesting to see!

Alternate Mode to support DisplayPort, Thunderbolt, and more

So, getting back to that file transfer speed and, more generally, moving data fast, perhaps faster than USB 3.1 allows: does USB-C give us any options beyond the 10Gb/s of USB 3.1?

Well, yes.  The USB-C specification allows data protocols outside the USB specification to be supported through an alternate mode system where the device can negotiate control of certain pins to be reconfigured to support data streams outside the USB data transfer specifications.

When no other demands are placed on the cable, USB 3.1 can use four lanes for USB SuperSpeed data transfer, but when using alternate mode some or all of these lanes can be used for other types of data signals.  These signals are not encapsulated within a USB signal; rather, they are carried directly on the wires, with the alternate mode protocol managing which signals are allowed to be sent.  Depending on the configuration, various levels of USB transfer can coexist with these alternate protocols.

The most popular protocol supported by current USB-C implementations is DisplayPort, allowing a 4K screen to be driven at 60Hz over two lanes while still allowing USB SuperSpeed traffic simultaneously.  Adapters are also available to attach an older monitor with DisplayPort or HDMI to a USB-C connector.  Not all USB-C ports support DisplayPort Alt Mode, so take care to check the specs of devices where you might want to use this feature.

Thunderbolt 3 (40Gb/s) is supported in some implementations of USB-C, carrying a wide range of data types including high resolution video, audio, and high speed 10Gb/s peer-to-peer Ethernet.  That last one is particularly interesting as a modern version of the old laplink cable, cutting out the middle man and providing a high bandwidth device-to-device connection ten times faster than most LANs.

USB-C Availability

At time of writing, USB-C is available across a modest range of products, and many newly announced products are supporting the technology as they hit the market.

We are seeing niche products such as portable monitors powered through USB-C, but do not expect to see widespread peripheral support until the installed base of USB-C reaches critical mass, probably less than a year away.

To help build that installed base, we have motherboards (used for desktop computers), notebooks, and tablets with USB-C support, as well as expansion cards that enable USB-C on legacy desktops.

If buying now, USB-C is not an essential purchase, but for a machine you expect to have in use for at least the next few years, it’s a desirable feature and worth considering during the hunt for your next computer.