By Rich Savastano, Senior Account Executive
Technology continues to move at an incredibly rapid pace. What will be on the IT priority list for 2013? According to industry pundits and what we’re seeing out in the real world, here are some hot trends that we’re keeping a close eye on:
BYOD – The Bring Your Own Device (BYOD) phenomenon is here to stay, with even the most highly regulated industries embracing it. Organizations are trying to keep pace, ensuring security, data protection, access to corporate applications, and compliance. The first step is to build a mobile strategy that gives users the device freedom they want without sacrificing security or compliance. The ability to separate personal and corporate data, secure data at rest and in transit, and find the right mobile device management solution are all critical components of a strategic BYOD plan.
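To make the “security without sacrificing freedom” idea concrete, here is a minimal sketch of a device compliance gate of the kind an MDM solution might enforce before granting access to corporate applications. The policy fields and thresholds are hypothetical, not drawn from any particular product.

```python
# Minimal sketch of a BYOD compliance gate: before a device gets access to
# corporate applications, check it against a hypothetical policy. The policy
# fields and minimum OS version are illustrative assumptions.

MIN_OS_VERSION = (6, 0)  # assumed minimum, for illustration

def is_compliant(device: dict) -> bool:
    """Return True if the device meets the (hypothetical) corporate policy."""
    return (
        device.get("passcode_set", False)
        and device.get("storage_encrypted", False)    # data at rest
        and not device.get("jailbroken", True)        # unknown devices fail closed
        and tuple(device.get("os_version", (0, 0))) >= MIN_OS_VERSION
    )

personal_phone = {"passcode_set": True, "storage_encrypted": False,
                  "jailbroken": False, "os_version": (6, 1)}
managed_phone = {"passcode_set": True, "storage_encrypted": True,
                 "jailbroken": False, "os_version": (6, 1)}

print(is_compliant(personal_phone))  # False: no at-rest encryption
print(is_compliant(managed_phone))   # True
```

Note the fail-closed defaults: a device that cannot report its state is treated as non-compliant, which is the posture most BYOD policies take.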
Cloud Data Protection and DR – The cloud can provide a secure and cost-effective resource for backup and archiving. So how should you incorporate cloud storage into your data protection plan? Key considerations include the type of data you want stored in the cloud, conserving and optimizing bandwidth, and whether a public, private, or hybrid approach is best. In the end, the cloud can be an effective way to securely store data off-site for disaster recovery and improve RPOs and RTOs while lowering costs and freeing IT staff from tedious backup tasks.
Big Data – Unstructured information is the biggest contributor of data growth. Making sense of this data and extracting value is critical and traditional relational databases were not designed for decoding this type of information. Hadoop can handle the extreme volumes, but there are complexities. Organizations that take a proactive approach to managing big data and leverage the right data analytics will have a competitive advantage.
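The programming model Hadoop applies at scale can be illustrated in a few lines. This is a single-process toy sketching the map and reduce phases over some invented log lines; a real Hadoop job distributes the same pattern across a cluster.

```python
# A toy map/reduce pass over unstructured log lines, illustrating the
# programming model Hadoop applies at much larger scale. Pure Python,
# single process -- a sketch of the idea, not a Hadoop job.
from collections import defaultdict

def map_phase(records):
    # Emit (key, 1) pairs, one per word.
    for line in records:
        for word in line.lower().split():
            yield (word, 1)

def reduce_phase(pairs):
    # Sum the values for each key.
    totals = defaultdict(int)
    for key, value in pairs:
        totals[key] += value
    return dict(totals)

logs = ["error disk full", "warn disk slow", "error network down"]
counts = reduce_phase(map_phase(logs))
print(counts["error"])  # 2
print(counts["disk"])   # 2
```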
Software-Defined Networking (SDN) – There is quite a bit of buzz surrounding SDN, especially since VMware’s $1.2 billion acquisition of Nicira. The goal of SDN (spearheaded by the Open Networking Foundation) is to bring the key characteristics of server virtualization to the network. The key concept of SDN is to decouple the network control plane from the network data plane, allowing for a “flat” network within a datacenter or between geographically disparate datacenters. This allows for greater mobility of virtualized workloads throughout your enterprise, including the ability to move workloads to and from cloud environments without having to change your IP scheme. SDN is being adopted at a rapid pace by IaaS public cloud providers and is emerging in private cloud solutions as well. The last hurdle to truly mobile applications and fully utilized global resources has been the network; SDN was developed to bring us into the era of mobility!
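The control plane/data plane split can be sketched in a toy form: a controller programs flow rules into a switch, and the switch only matches packets against those rules. The match fields and action names here are invented for illustration; real SDN protocols such as OpenFlow define far richer match/action semantics.

```python
# Toy illustration of decoupling the control plane from the data plane:
# a "controller" installs flow rules, and a "switch" only matches packets
# against those rules. Field and action names are invented assumptions.

class Switch:
    def __init__(self):
        self.flow_table = []  # (match_fn, action) pairs pushed by a controller

    def install_flow(self, match_fn, action):
        self.flow_table.append((match_fn, action))

    def forward(self, packet):
        # Data plane: just match and act. No policy lives here.
        for match_fn, action in self.flow_table:
            if match_fn(packet):
                return action
        return "send-to-controller"  # table miss

class Controller:
    def program(self, switch):
        # Control plane: the forwarding policy lives here, not in the switch.
        switch.install_flow(lambda p: p["dst"].startswith("10.1."), "port-1")
        switch.install_flow(lambda p: p["dst"].startswith("10.2."), "port-2")

sw = Switch()
Controller().program(sw)
print(sw.forward({"dst": "10.1.0.5"}))     # port-1
print(sw.forward({"dst": "192.168.0.9"}))  # send-to-controller
```

Because the policy lives only in the controller, moving a workload between datacenters is a matter of re-programming flow tables rather than re-addressing the network.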
By Tim Donovan, President
You may have read recently about a number of cloud breaches which call the security of cloud services into question, but were these really issues with the cloud service, or were they the result of a failure to follow basic security best practices? I found these five top safety tips from John Sutter at CNN pretty interesting. They are all related to preventing hackers from getting to your personal data and private information, and most of them, like not using the same password for every site you visit, are really just common sense.
But it got me thinking about cloud risks and how to mitigate them. When it comes to preventing a personal data breach, the onus is on you to make sure your personal information is not compromised. When it comes to corporate data, do you want that sole responsibility? I don’t think so. Mitigating the risks of storing corporate data in the cloud goes beyond common sense, but for experts like Daymark it is part of our everyday life. We live and breathe the best practices to protect data. So if you’re considering cloud services, give us a call. In the meantime, I don’t want your kids’ pictures to be lost or worse, your bank account hacked, so check out this article and stay safe.
By Ned Fairweather, Senior Consultant
Symantec recently released NetBackup 7.5 for general availability. The new release boasts several enhancements that broaden the NetBackup platform and streamline previously daunting administrative tasks. The following are a few highlights:
- Cloud Services
- Replication Director
- NetBackup Accelerator
- NetBackup Search
Let’s start with the enhanced features:
The Media Server Deduplication Option (MSDO), introduced with the advent of NetBackup 7, allowed users to create a deduplication target on local or SAN-attached disk. Database restrictions did not allow the disk to exceed 32TB in size. NetBackup 7.5 doubles that limit, allowing for up to 64TB of deduplication disk. Performance of the deduplication process has also been improved. Specific presentations of iSCSI disk are now supported with MSDO as well, which lifts the requirement for Fibre Channel connectivity.
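For readers new to deduplication targets, here is a minimal sketch of the fixed-block dedupe idea: fingerprint each block and store every unique block only once. The block size and data are deliberately tiny to show the mechanics; this is not how MSDO is implemented internally.

```python
# Sketch of fixed-block deduplication, the idea behind targets like MSDO:
# split a stream into blocks, fingerprint each block, and store each unique
# block once. Block size and hashing choices here are illustrative only.
import hashlib

BLOCK_SIZE = 8  # unrealistically small, just to show the mechanics

def dedupe(data: bytes, store: dict) -> list:
    """Store unique blocks in `store`; return the recipe of fingerprints."""
    recipe = []
    for i in range(0, len(data), BLOCK_SIZE):
        block = data[i:i + BLOCK_SIZE]
        fp = hashlib.sha256(block).hexdigest()
        store.setdefault(fp, block)   # only the first copy is kept
        recipe.append(fp)
    return recipe

def restore(recipe, store) -> bytes:
    """Rebuild the original stream from fingerprints."""
    return b"".join(store[fp] for fp in recipe)

store = {}
recipe = dedupe(b"AAAAAAAABBBBBBBBAAAAAAAA", store)
print(len(recipe))  # 3 blocks referenced
print(len(store))   # 2 unique blocks actually stored
```

Three blocks are referenced but only two are stored, which is the whole economy of a deduplication pool; the pool's internal database tracking those fingerprints is what imposed the original 32TB ceiling.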
Symantec’s V-Ray technology is built into all their backup products. V-Ray leverages the inherent tool sets within leading hypervisors like Hyper-V and VMware and blends those capabilities with their Open Storage Technology (OST) plugins for deduplication disk devices. Recovery capabilities at the application level and new support for vSphere 5 are also touted in this version.
Most businesses have heard of, or are actively investigating, cloud-based solutions for several aspects of their IT infrastructure, and backup is a primary one. NetBackup 7.5 introduces integration with several new public cloud providers. Symantec has also enhanced the encryption option for its cloud-based backup solutions.
Now for the brand new features in NetBackup 7.5:
Symantec’s agnostic approach to backup target technologies has always been a highlight of NetBackup. Its OST plugin, which allows hardware vendors to integrate seamlessly with NetBackup, is now reaching further into array management. NetBackup 7.5 introduces a new feature called Replication Director, aimed at allowing NetBackup to control NAS device operations, like snapshots and replication, which were formerly handled at the array level. NetApp is the first vendor to work directly with Symantec on building a plugin for its platforms. Bringing that level of array management into NetBackup gives the storage administrator a single pane of glass, simplifying day-to-day management.
Meeting backup windows to ensure the integrity of data and limit the impact on client systems is always at the core of an administrator’s daily duties. The process of identifying files for backup and changes to those files on a client system can be a time consuming task especially when a file system contains file counts in the millions. Traditionally NetBackup would walk the file system to identify what is required to be backed up at the start of a schedule. NetBackup 7.5 has created NetBackup Accelerator which uses a unique indexing system on the client to identify changes as they happen. This removes the requirement for NetBackup to process this during a backup window. Since changes have already been cataloged, NetBackup can immediately start its write operations. This new efficiency significantly decreases the backup windows for all clients including previously problematic ones. This technology also leverages what Symantec calls Optimized Synthetics, allowing the administrator to create a full backup from all the tracked changes similar to Symantec’s Synthetic backup option. Overall, NetBackup Accelerator has shown significant decreases in backup windows as well as decreasing the bandwidth and resource time associated with a traditional backup.
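Conceptually, the Accelerator approach replaces a full file-system walk with a journal that records changes as they happen, so the backup only touches what the journal lists. A minimal sketch of that idea (not Symantec’s actual on-client index):

```python
# Sketch of the change-tracking idea behind NetBackup Accelerator: instead of
# walking the whole file system at backup time, record changes as they happen
# and back up only what the journal lists. A conceptual illustration only.

class ChangeJournal:
    def __init__(self):
        self.changed = set()

    def record_write(self, path):
        # Called as changes happen, outside the backup window.
        self.changed.add(path)

    def drain(self):
        # Consumed at backup time: hand over the change list and reset.
        paths, self.changed = sorted(self.changed), set()
        return paths

journal = ChangeJournal()
journal.record_write("/data/report.docx")
journal.record_write("/data/db/file.db")
journal.record_write("/data/report.docx")  # repeated writes tracked once

to_back_up = journal.drain()
print(to_back_up)        # ['/data/db/file.db', '/data/report.docx']
print(journal.drain())   # [] -- nothing changed since the last backup
```

Because the expensive discovery work happens continuously rather than at the start of a schedule, the backup window shrinks to roughly the time needed to write the changed data.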
Legal holds are becoming more frequent as E-Discovery requests are typical of most companies’ litigation procedures. The last new feature added to NetBackup 7.5 is NetBackup Search. In the past, backup administrators would have to navigate through hundreds, if not thousands, of images in order to identify a custodian’s data footprint. This process is typically laborious and quite often cannot meet a legal team’s timetable. NetBackup Search streamlines this process and allows a backup admin to find the specified information efficiently and maintain the proper retention holds that a court may require. Additionally, NetBackup Search naturally dovetails with Symantec’s other E-Discovery products, Enterprise Vault and Clearwell, with the ability to import NetBackup information into those products and use their advanced capabilities.
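The speedup indexing brings to e-discovery can be illustrated with a tiny inverted index over backup image metadata: build the index once, then answer a custodian query without scanning every image. The image IDs and fields below are invented for illustration.

```python
# Sketch of why an index makes e-discovery searches fast: build an inverted
# index over backup image metadata once, then look up a custodian's term
# directly instead of scanning every image. All data below is invented.
from collections import defaultdict

def build_index(images):
    """Map each term to the set of image IDs containing it."""
    index = defaultdict(set)
    for image_id, words in images.items():
        for word in words:
            index[word.lower()].add(image_id)
    return index

images = {
    "img-001": ["jsmith", "contracts", "2011"],
    "img-002": ["mdoe", "payroll", "2011"],
    "img-003": ["jsmith", "payroll", "2012"],
}
index = build_index(images)

# One lookup replaces a scan of hundreds or thousands of images.
print(sorted(index["jsmith"]))  # ['img-001', 'img-003']
```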
In order to take advantage of these features, one can install a net-new environment or upgrade an existing implementation. The requirements to upgrade are minimal, and the Symantec 7.x portal is a great way to identify best practices for upgrading: http://www.symantec.com/business/support/index?page=content&id=TECH74584. Any 6.x (end of life for 6.x is due in October 2012) or 7.x environment can be upgraded directly to 7.5. Of note, this particular upgrade merges the image catalog with the underlying NetBackup database. Merging the Enterprise Media Management (EMM) database with the image headers addresses several notable issues with previous versions. While the upgrade to 7.5 may add some new complexities, it is poised to eliminate previous catalog consistency problems and improve performance for restores, scheduling, image cleanups, and searches in large catalogs.
Author: Kushal Patel, Senior Consultant
Is the cloud turning IT upside down? All the talk around IT as a Service and the cloud can make even the most experienced, seasoned IT leaders worry that their team may not have what it takes to keep up. If you listen to some of the pundits, they would have you believe that IT teams are in danger of becoming obsolete due to a lack of appropriate skills and understanding of business processes. While the cloud is creating a paradigm shift, I think many of the “experts” are overreacting and underestimating IT’s resiliency.
How many times has IT already “evolved?” There’s been the client/server revolution, wireless networking, mobile support, fighting malware, and virtualization of just about everything from servers and storage to applications and desktops. The very nature of technology, and therefore the job of IT, means it is impossible for the technology, or the role of IT, to be static. The technical know-how of IT will always be a necessity; this time it may be more about a shift in mindset that requires a realignment tied to process and workflow development. If IT leaders drive the “attitude adjustment” of their teams and reassign priorities to be more tightly aligned with business goals, the team (and its skill sets) will evolve – yet again.
So what will the real impact of the cloud on IT’s role be? As IT shifts into the “cloud era,” so must your organization’s skill sets and goals. The ability to identify and leverage resources – whether they come from the cloud or the company’s own data center — is becoming a key part of IT leaders’ responsibilities and this mindset should also trickle down to your administrators and engineers. Having a unified message that IT should be a business unit that provides innovation in line with the business’ goals is essential for the evolution of IT.
And by the way, it doesn’t stop at IT. Finance and procurement also need to understand how the new cost model of cloud computing will affect the budgets for the organization. Heck, they might actually be the ones driving you towards the cloud. It will be interesting to see if they can evolve as quickly as you!
Author: Sean Gilbride, Director of Professional Services Operations
NetApp has been very busy over the past 12 months refreshing their entire Unified Storage line. NetApp refreshed their mid-tier (32xx) and high-end (62xx) platforms in late 2010 and has recently released their new entry-level platform, the FAS2240.
The release of the FAS2240 marks the end for the older FAS2020 & FAS2050 systems. NetApp will still be offering the FAS2040 (refreshed in 2010) at an aggressive price point to target EMC VNXe sales.
Don’t let the entry-level designation fool you; the FAS2240 can handle mid-tier workloads:
- The FAS2240 has been released in two flavors:
  - FAS2240-2: 2 RU system supporting 24 internal 2.5” SAS drives
  - FAS2240-4: 4 RU system supporting 24 internal 3.5” SATA drives
- Both systems support up to 144 drives (432 TB) using external shelves
- Both systems support SATA, SAS & SSD on external shelves
- 2x – 3x performance improvement over the previous generation (mid-tier performance)
- The FAS2240 can be converted into a disk shelf when upgrading to a larger array
- More software & capabilities included in base licensing (All protocols included)
- Simplified management with OnCommand System Manager 2.0
- Support for Data ONTAP 8.1
- Support for 8Gb FC & 10GbE (via mezzanine I/O card)
So why is this important?
NetApp has recognized the need for a refreshed entry level platform and has made several important improvements outside of the expected performance boost & increased port density. These improvements include support for Data ONTAP 8.1 Operating System with large aggregate support, support for 8Gb FC or 10GbE and Multi-Path HA cabling for SAS disk shelves to mention a few.
NetApp also continues to highlight the value of their unified storage platform, which leverages the same controllers & operating system for every system they offer. This is critical when considering the life of the platform and the importance of enabling simplified upgrades as customer requirements grow or change. With NetApp, an upgrade is usually as simple as performing a head swap.
What is it missing?
The FAS2240 is intended to be an entry-level box, so it does not include support for PCI expansion. The most notable impact of this is the lack of support for FlashCache. The FAS2240 also does not include support for MetroCluster, which enables long-distance clustering. Both of these capabilities are available starting in the FAS32xx series.
Author: Bruce Hall, Director of Technology
You had to be living ‘off-grid’ this week if you didn’t hear about the widespread problems that resulted from millions of consumers trying to download Apple’s new iCloud sharing and data protection service. Come on, you know you contributed to that problem; I know I did.
If I can secure a simple solution to automatically share all my digital media with each member of my family, from any device, while having off-site protection and reliable, simple recovery, all at a reasonable, predictable monthly cost, just tell me where to sign up! The widespread adoption of these consumer technologies (dropbox.com, box.net, online-backup solutions, etc.) begs the question: why can’t the same benefits be realized by companies of all sizes?
Unstructured data, one of the biggest drivers of explosive data growth, is particularly well suited to cloud storage. Some early cloud-based solutions attempted to address this challenge with a 100% cloud solution, providing a gateway to direct primary data to and from the cloud. The industry quickly learned that a hybrid solution is what customers need: a combination of an onsite device for performance and availability with off-site capacity and data protection. This led to the next round of hybrid NAS solutions. Imagine an on-premises device that has the intelligence to cache the most frequently accessed files, based upon end-user demand, while seamlessly keeping 100% of all data in the cloud and moving data back and forth on demand with limitless retention. Then layer on multi-site access to the same file-space simultaneously, tablet and smartphone access from anywhere in the world, and collaboration features, all for a predictable per-gigabyte monthly charge. Further imagine that disaster recovery of all this unstructured data is as simple as powering up a virtual machine and entering service credentials. Within minutes the entire directory tree is presented and file restoration from the cloud is automatic, prioritized by end-user demand.
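The caching behavior described above can be sketched with a simple LRU policy standing in for whatever heuristic a real cloud gateway uses: the on-premises tier holds recently used files, and the cloud remains the authoritative copy of everything.

```python
# Sketch of the hybrid-NAS idea: a small on-premises cache holds the most
# recently used files while the cloud keeps the full copy. An LRU policy
# stands in for whatever heuristic a real gateway actually uses.
from collections import OrderedDict

class HybridCache:
    def __init__(self, capacity, cloud):
        self.capacity = capacity
        self.cloud = cloud            # authoritative copy of every file
        self.cache = OrderedDict()    # local, fast tier

    def read(self, name):
        if name in self.cache:
            self.cache.move_to_end(name)       # local hit, mark recently used
            return self.cache[name], "cache"
        data = self.cloud[name]                # miss: pull from the cloud
        self.cache[name] = data
        if len(self.cache) > self.capacity:
            self.cache.popitem(last=False)     # evict least recently used
        return data, "cloud"

cloud = {"a.doc": b"A", "b.doc": b"B", "c.doc": b"C"}
nas = HybridCache(capacity=2, cloud=cloud)
print(nas.read("a.doc")[1])  # cloud
print(nas.read("a.doc")[1])  # cache
nas.read("b.doc"); nas.read("c.doc")   # fills the cache, evicts a.doc
print(nas.read("a.doc")[1])  # cloud again
```

The DR story in the paragraph follows from the same structure: since the cloud copy is complete, recovery is just standing up a new (empty) cache in front of it.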
Cloud/online backup is not just viable for consumers and small businesses. With hybrid and private cloud solutions, on-premises devices provide high-performance backup and restore with automated off-site protection and reliable recovery, using efficient block-level, incremental-forever (deduplication) technology to handle the volumes of today and tomorrow. Engaging the right Managed Service Provider (MSP) to bring the expertise and services to phase in this technology can protect recent investments while winding down the legacy environment over time, until all historical data has expired and the legacy solution can be fully retired. Free up your existing resources for more important tasks and gain control of your data protection challenges with a reliable solution for a predictable monthly per-gigabyte cost. It’s getting close to being as simple and cost-effective as iCloud.
Author: Ned Fairweather, Senior Consultant
Symantec just announced general availability for Enterprise Vault 10 – the newest version of its email and content archiving software. Is it worth upgrading? I think so, and here’s why: the current construct of Enterprise Vault (EV) is a pure datastore. Symantec has added intelligence to the information stored, as well as context to unstructured data. Social media archiving (pending the Data Insight rollout) and integration with the cloud are also planned for the release. The recent acquisition of Clearwell is helping bring Symantec’s best-of-breed archiving together with eDiscovery.
EV is looking to increase governance via content-based searching and increase business value by providing classification. They are using RAIL (Rapid Agent Ingestion Layer) to achieve content-specific abilities based on lines of business. Scalability efforts are focused on petabyte-sized data stores.
Improvements are being made to event-based retention and expiry filters to enhance management of deletion for regulation and/or reclassification. Additional focus will be on meeting the requirements of new cloud-based email systems beyond Microsoft, IBM, and Google. In addition to cloud solutions, PST file sprawl on file servers will also be addressed.
Enhancements to EV 10 are as follows:
- Full 64-bit index with a GUI vs. a few scripts to build on
- 6x faster searches
- Storage footprint remains at 12% overhead as in previous versions
- Support for Outlook 2011
The future of EV should make complying with legal teams’ eDiscovery requests easier. Structured data archiving is being worked on as a partnered solution (e.g., Oracle, Informatica).
Getting back to whether upgrading makes sense: EV 10 has some great new features; however, upgrade paths from some versions are not direct. Upgrades would need to be run from one version to the next to get to 10. Here are the details on supported upgrade paths: http://www.symantec.com/business/support/index?page=content&id=TECH53174 as well as the official word on EV 10 from Symantec: http://www.symantec.com/about/news/release/article.jsp?prid=20110801_02
Author: Jeff Choinski, Consultant
Symantec held their annual World Sales & Marketing Conference July 10-15, 2011, in Las Vegas. This year, they combined the partner training program with their annual system engineer training. It created an opportunity for partners to meet up and trade stories and also hear experiences from Symantec’s SEs.
This year’s main themes focused on the cloud, protecting virtual environments, and how Symantec’s products fit into this ever-evolving IT landscape. Products such as NetBackup for VMware help protect a company’s virtual environment with features like Auto Image Replication (AIR), automatic detection of newly created guests, and granular recovery and dedupe through V-Ray integration. The “.cloud” product set now enables companies to offer backup and archive as a service, further maturing their “IT as a service” models while improving availability and reducing costs. The appliance offerings in both the NetBackup and PureDisk space give organizations a scalable all-in-one solution to protect their data. Symantec has also expanded the hardware configurations of the NetBackup appliance to accommodate network environments from 1Gb to 10Gb Ethernet as well as Fibre Channel infrastructures. ApplicationHA for VMware adds another level of protection for Windows and Linux VMs by providing a product that is not only guest aware, but application aware. This provides the ability to stop and restart applications when failures occur, instead of restarting the entire VM. Working with VMware HA, you can restart and recover the virtual machine as well, if necessary. The net is that you can run more business-critical applications in a virtual environment without having to worry about outages and downtime. Keep an eye out for more releases from Symantec this year that enable businesses to protect their data, move to the cloud, and reduce downtime.
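The application-aware restart behavior described for ApplicationHA can be sketched as a probe-and-restart loop; the probe and restart calls below are stand-ins for illustration, not the actual ApplicationHA API.

```python
# Sketch of application-aware HA: probe the application (not just the VM)
# and restart only the failed service, escalating after repeated failures.
# The probe/restart callables are hypothetical stand-ins.

def monitor(app, probe, restart, max_restarts=3):
    """Return how many restarts were attempted before the app was healthy
    (or the restart budget was exhausted, at which point a real product
    would escalate to restarting the whole VM)."""
    restarts = 0
    while not probe(app) and restarts < max_restarts:
        restart(app)
        restarts += 1
    return restarts

state = {"healthy": False}
def probe(app): return app["healthy"]
def restart(app): app["healthy"] = True  # pretend one restart fixes it

print(monitor(state, probe, restart))  # 1
```

The point of the sketch is the escalation order: fix the application first, and only fall back to VM-level recovery (VMware HA) when service-level restarts are exhausted.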
Author: Jake Roczniak, Consultant
Last month EMC held its annual user conference, EMC World 2011, in Las Vegas. Each year the company chooses a theme which broadly defines its core focus for the event. This year’s theme was “Cloud Meets Big Data”. In his keynote, EMC CEO Joe Tucci declared that EMC’s role was “… to lead customers on their journey to cloud computing and transforming IT.” The “Big Data” aspect EMC is referring to is the projection that the so-called digital universe will contain 35 zettabytes of information within the next decade. IDC also expects server images to grow 10x in the next decade, so not only will servers continue to get more powerful, they will also multiply wildly. EMC introduced what it is calling “The EMC Big Data Stack,” defining its view of how to store, manage, and act on the big data coming downstream. It is also aligning much of its product set to be efficient in its vision of a hybrid-cloud model. EMC made many announcements; the ones I think will be most interesting to keep an eye on include:
Greenplum & Hadoop – a “big data” analytics hardware platform
Project Lightning – Flash-based PCIe server-side device for moving workloads between the storage array and the physical server itself, utilizing FAST
All Flash versions of the VNX and VMAX
Isilon 108NL – New hardware that can reach a 15 petabyte file system in a single volume
VPLEX Geo – Create a federated storage pool at asynchronous distance
Atmos 2.0 – The second generation of EMC’s globally scalable storage system
Were you at EMC World this year? If so, what did you think?
Author: Kushal Patel, Senior Consultant
Really? Is everyone that surprised that a cloud provider had an outage? An Amazon EC2 service disruption is never timely, but anyone with a well-planned DR strategy should not have been affected. If you want to know what happened, you can read the Amazon post mortem here.
This begs the question: “Are users of cloud service providers neglecting to consider Disaster Recovery as part of their new cloud based architecture?”
Simple answer: “If they are, they shouldn’t…”
The main message here is: read the cloud providers’ SLAs, compare them to your Recovery Time and Recovery Point Objectives, and plan accordingly. The location of application, compute, network, and storage resources, whether in the cloud or on-premises, does not preclude an organization from planning for DR. This includes Infrastructure, Platform, AND Software as a Service.
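A trivial sketch of that comparison: line up each provider’s stated recovery figures against your own objectives before committing. All the numbers below are invented for illustration.

```python
# Sketch of comparing a provider's SLA against your own RTO/RPO objectives.
# The SLA fields and all figures are hypothetical illustrations.

def meets_objectives(provider_sla, rto_minutes, rpo_minutes):
    """True only if the SLA's recovery figures are within our objectives."""
    return (provider_sla["rto_minutes"] <= rto_minutes and
            provider_sla["rpo_minutes"] <= rpo_minutes)

provider_a = {"rto_minutes": 240, "rpo_minutes": 60}
provider_b = {"rto_minutes": 30,  "rpo_minutes": 15}

# Our objectives: back online within 1 hour, lose at most 30 minutes of data.
print(meets_objectives(provider_a, rto_minutes=60, rpo_minutes=30))  # False
print(meets_objectives(provider_b, rto_minutes=60, rpo_minutes=30))  # True
```

In practice the comparison covers far more than two numbers (data center locations, failover procedures, credits vs. actual recovery), but even this crude check would have flagged architectures that depended on a single provider region.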
Consult with a DR specialist to create a design that encompasses all of your critical resources and adheres to your business’s availability needs. Like I said, “You get what you plan for…”
For those of you who were affected by the outage, I truly am sorry for your inconvenience, but I thank you for the lesson.