Month: February 2016

Tips for Successful Project Delivery: Customer Engagement, Respect and Communication

What if a professional athlete set a standard where winning was not enough? Instead, they had to achieve a personal best or break a previous record year after year.

What if a new theme park opened on schedule, with no delays, and offered tickets to the first one million visitors to return at any time and bring up to 100 guests at no additional charge?

Welcome to my world. As an IT provider, I face a similar challenge: delivering a project experience that not only achieves every project goal, but also blows the customer away.

I have delivered hundreds of projects for customers over my career, and I have seen projects go smoothly and poorly. I have seen projects end with both the customer and the provider feeling a sense of accomplishment, and I have seen projects drag on for months, even years, and then fizzle out, almost as if customer and provider had conceded defeat, for any of the following reasons:

  • lofty project goals
  • misjudged budgets
  • technology that couldn’t be wrangled in

Sound familiar to anyone? These are some of the reasons behind PMI's (pmi.org) finding that 89 percent of projects at high-performing organizations meet their original goals and business intent, compared with just 36 percent at low-performing organizations.

The Cost of Poor Performance

Those low-performing organizations also lose 12 times more money than high-performers.

My customers include professionals in all aspects of IT service delivery. Their business and IT needs are great because so much depends on the success of these projects: their budgets, their revenue goals, their own staffing decisions, how upper management perceives them, and how other customers perceive them.

But what many people don't realize is that poorly performing projects hurt customers and providers equally. Obviously the customer is frustrated, and perhaps feels slighted by what they are getting versus what they are paying for. But these projects severely impact the provider as well. The provider's number one priority is to deliver on the scope of the project; that has to be the principle held above all else, because a project that ends with an unsatisfied customer is a complete waste of everyone's time. A very close second priority, however, is delivering the project quickly and efficiently, even when there is no time pressure from the customer.

Long-running projects incur overhead in several forms. As projects run late, the provider may end up with more concurrent active projects, and engineers have to split their time and attention between two or more of them, which can lower quality. The longer a project goes on, the more disconnected the team can become: momentum slips, and decisions made early on start to be questioned. Changes in direction delay the project even longer, and more meetings pile up. For a typical small project with just five resources, a two-month delay can easily incur 50 hours of additional time.
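To make that 50-hour figure concrete, here is a back-of-envelope sketch; the per-resource costs for extra status meetings and context switching are illustrative assumptions, not measured data:

```python
# Back-of-envelope estimate of schedule-slip overhead.
# Assumption (illustrative): each resource on a delayed project burns a few
# extra hours per month on added status meetings and on context switching
# between concurrent projects.

def delay_overhead_hours(resources, delay_months,
                         extra_meeting_hours_per_month=3,
                         context_switch_hours_per_month=2):
    """Extra hours incurred by a delay, beyond the planned work itself."""
    per_resource = extra_meeting_hours_per_month + context_switch_hours_per_month
    return resources * delay_months * per_resource

# Five resources, two-month slip: 5 * 2 * (3 + 2) = 50 extra hours.
print(delay_overhead_hours(resources=5, delay_months=2))  # → 50
```

Even modest per-person costs compound quickly, which is why a delay hurts the provider even when the customer applies no time pressure.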

I have found that successful projects that avoid these pitfalls and end in mutual accomplishment always require both parties to be fully engaged and invested. Since the nature of project delivery is a client/vendor relationship, it is up to us as IT service providers to ensure that engagement happens and to drive mutual investment in the outcome.

Customer Engagement

First, let me expand on the benefits of customers remaining actively invested in their projects. When a customer signs a statement of work (SOW) for a project, they agree to pay some amount to have work done. Whenever money changes hands like this, a sense of entitlement can emerge on the customer's part that often goes like this: "I did my part by paying you; now you go deliver on what I paid for."

I want to be clear: this is perfectly understandable and not completely unreasonable. However, as providers striving to fully deliver on customer needs and goals, we need the customer to remain engaged and part of the process. I call it "everyone in the boat," and the metaphor appeals to me because you can think of the project team as bringing the customer to the goals rather than bringing the goals to the customer. In the boat, the provider is the captain and crew of a private cruise liner, and the customer is the pampered passenger with input on where the yacht goes.

In the end, however you conceptualize it, a customer who is engaged in a project is less likely to be critical of decisions about direction and design, and more likely to feel some ownership of the outcome. A customer who is part of the process is less likely to criticize than one who remains a distant observer. In my experience, projects with high customer involvement end smoothly, with a sense of mutual accomplishment, and they often build lasting business relationships between provider and customer.

Let’s examine some tactics to improve customer engagement and buy-in. The following two main methods get customers engaged in projects, help keep them engaged, and improve efficiency as you work.

Method 1: Build Trust and Respect Between Project Team and Customer at the Start

Building mutual respect is key to smooth projects: it means decisions about the project can be made constructively and without acrimony. There are several aspects to building a relationship based on mutual trust and respect.

First Impressions: The old cliché is true; there’s only one chance at a first impression. Moreover, a good first impression only lasts as long as you live up to it. The minute you falter, the good first impression is gone, so it is critical that you stay consistent in your positive interactions. Do your homework and make sure all project team members know the project inside and out and are ready to speak authoritatively on their parts before engaging the customer’s team.

Mutual Decision Making: The next opportunity for building trust and respect is the experience you give the customer in mutual decision making. As the provider, take the time to lead them through the decision process. Where the customer has no opinion, backfill with yours. When a customer has a strong opinion on a topic, try to yield to their wishes. When the customer's wishes are not aligned with your agenda (best practices or efficient execution), you must engage them in dialogue. That dialogue must always be grounded in respect for the customer's point of view and focused on a mutually beneficial resolution centered on the goal, not the execution (the what, not the how).

Respect for Time: While keeping the customer involved, we never want to waste their time. Guide them to focus their attention on the important parts of the project, not the mundane details. Customers should be engaged in decisions about whether to do something, but not necessarily about exactly how to do it. Customers should be apprised of the how, but in more of a review format, to build buy-in for execution.

Execution: One sure-fire way to lose the customer's respect is to fail to execute. Always do what you say you will do, when you say you will do it. As mentioned above, mess this up once and you've lost the game. For that reason, it is very important to be realistic about what you commit to and when. Set yourself up to succeed: you control both the expectation and the execution. If you have a perfect track record of execution, the customer has no reason to question your plan.

Method 2: Communication

The what, when, and how of communication can really make a difference in projects. Different customers react differently to your communication methods. For example, one might prefer a regular status update by e-mail, while another expects a milestone report with a summary of weekly achievements.

Goals: The very first communication engagement should be about establishing project goals. These may or may not have been adequately defined during presales, so this is the first opportunity to interact. If the goals are already well defined, the provider's role is to articulate them back to the customer to confirm that both parties share the same vision. If the visions differ, or the goals were never adequately defined, this engagement is the first opportunity for customer and provider to collaborate and build mutual trust and respect.

Level of Detail: Meaningful ongoing communication should be tailored to the individual customer; there is no single right way to go about it. Too much is a turnoff and customers disconnect; too little and they wonder whether you're making any progress at all. I personally like frequent informal contact with periodic formal updates. In keeping with the respect-for-time concept, updates must be meaningful and relate back to the customer's business needs, not the gory details of execution. Consider a daily dashboard complemented by weekly reports.

Creativity vs. Execution: Good project delivery draws a line between creativity (design) and execution (plan). Customers lose faith if you are months into a project and still redesigning some work item every week. Try to settle all design details, and communicate those design decisions, up front. As a provider, walk through the whole execution conceptually and identify every question that needs answering first. Engage the customer in a high-level walkthrough of the project and derive answers to those questions together. During the design stage, gather information and understanding in sessions with the customer, but organize the designs into work plans away from them to save time (yours and theirs). Present and review for final approval. Once you both agree on all design elements, close the design discussion and begin executing to a plan and timeline. For large projects, break this cycle into chunks where appropriate.

 

What Tool Helps You Conquer the 7 Biggest File Migration Challenges?

Want to eliminate platform migration headaches? Reduce cost, effort, and lost time? You're in the right place. I want to share some tips to help you simplify your migration activities.

I have been performing file-based migrations for well over 20 years, primarily with EMC technology. The majority have been SMB-based, including server-to-server, server-to-NAS, and NAS-to-NAS migrations.

Some of the tools that I have leveraged over the years include:

  • ROBOCOPY: This Microsoft utility, originally part of the Windows NT 4.0 Resource Kit, has been around since 1997.
  • EMCOPY: This EMC utility is used primarily to copy data to a Celerra-based file system; however, the other tools in the suite (EMCopy, Sharedup, and LGDUP) have been used on several occasions, even in conjunction with Robocopy.
  • RSYNC: This UNIX utility is used for NFS migrations.

As technology advanced, storage vendors provided a way to perform these migrations using internal tools. Some of the tools were developed for disaster recovery while others were developed explicitly for data migration. Two examples include:

  • VNX Replicator: This tool can replicate from Celerra/VNX to another Celerra/VNX array.
  • isi_vol_copy: This Isilon utility uses NDMP streams to copy data from a NetApp array to an Isilon array. EMC later added isi_vol_copy_vnx, which allows NDMP-based copies from VNX to Isilon.

Considerations: One of the major considerations we face when performing file-based migrations is which tool is best for the job. The answer is not easy, because it may take multiple tools to complete the work. Another consideration is how to streamline the data migration process.

Among the better-known file migration tools that address these considerations is Datadobi DobiMiner. For the last five years, I have admired Datadobi for their CAS-to-NAS migration methodologies and their continually evolving NAS-to-NAS migration tool. DobiMiner streamlines file-based data migrations, simplifies the entire end-to-end process, and tackles all of the following challenges:

Challenge 1: Bandwidth Throttling

Customers may need to throttle the bandwidth consumed by the migration. That can mean limiting the number of concurrent sessions, or even stopping the migration entirely during certain hours.

DobiMiner can schedule bandwidth based on each file server's hours of operation. This feature limits bandwidth during normal business hours so migrations continue at reduced speed, then automatically raises the limit during the next window. Imagine the flexibility of adjusting schedules per file server and managing multiple bandwidth schedules over the duration of the migration phase. It also includes file server scans to keep the data fresh.
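As an illustration of how per-server bandwidth windows behave, here is a minimal sketch; the server name, hours, and limits are hypothetical and do not reflect DobiMiner's actual configuration format:

```python
# Hypothetical per-file-server bandwidth schedule: throttle during business
# hours, open up overnight. Windows are (start_hour, end_hour, limit_mbps).

SCHEDULES = {
    "fileserver01": [(8, 18, 100), (18, 8, 1000)],
}

def allowed_mbps(server, hour):
    """Return the bandwidth cap (Mbps) for a server at a given hour (0-23)."""
    for start, end, limit in SCHEDULES[server]:
        if start < end:                     # window within a single day
            if start <= hour < end:
                return limit
        else:                               # window wraps past midnight
            if hour >= start or hour < end:
                return limit
    return 0                                # no window matched: pause copies

print(allowed_mbps("fileserver01", 10))   # business hours → 100
print(allowed_mbps("fileserver01", 22))   # overnight window → 1000
```

Managing one such table per file server captures the flexibility described above: each server gets its own windows, and changing the schedule mid-migration is just a data edit.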

Challenge 2: Detailed Reporting

Customers often ask for a concise daily status report on all migration jobs, covering successes, failures, and the details surrounding both. Other tools do have log files, but it takes some tweaking to extract an exact report, and emailing a report, where supported at all, typically requires additional configuration effort.
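To illustrate the kind of tallying a daily status report requires, here is a small sketch that summarizes log lines into successes and failures; the `OK`/`FAIL` line format is invented for the example and does not match any particular tool's log output:

```python
# Summarize migration log lines into a daily report structure.
# Assumed (hypothetical) format: "OK <path>" or "FAIL <path> <reason>".

def summarize(log_lines):
    report = {"ok": 0, "fail": 0, "failures": []}
    for line in log_lines:
        status, _, detail = line.partition(" ")
        if status == "OK":
            report["ok"] += 1
        elif status == "FAIL":
            report["fail"] += 1
            report["failures"].append(detail)   # keep detail for drill-down
    return report

log = ["OK /share/a.txt", "FAIL /share/b.txt access denied", "OK /share/c.txt"]
print(summarize(log))
```

This is the "tweaking" step the paragraph alludes to: every tool logs differently, so each needs its own parser before a uniform report can be emailed.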

DobiMiner generates reports on demand. In addition to the source and target, the reports show the number of files, directories, symbolic links, and errors. In the event of an error, the reports let you drill down to analyze the cause. Migration reports can be emailed daily, weekly, or monthly.

Challenge 3: Job Scheduling

With traditional tools, job scheduling must be done from the Windows scheduler on each server performing the migration, and job schedules can overlap when multiple proxy servers are used.

DobiMiner can schedule steady-state incremental copies, and each individual migration can be scheduled to suit the migration effort. For example, a migration can start with daily incremental copies and then switch to hourly as the cutover window approaches.
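The escalating cadence can be sketched as a simple mapping from days-until-cutover to a sync interval; the thresholds below are illustrative assumptions, not product defaults:

```python
# Tighten the incremental-copy cadence as the cutover window approaches.
# Thresholds are illustrative, not DobiMiner defaults.

def sync_interval_hours(days_until_cutover):
    """Return how many hours to wait between incremental copies."""
    if days_until_cutover > 14:
        return 24        # early on: daily incrementals
    if days_until_cutover > 2:
        return 6         # closer in: four times a day
    return 1             # final stretch: hourly

print(sync_interval_hours(30), sync_interval_hours(7), sync_interval_hours(1))
# → 24 6 1
```

Tighter intervals near cutover keep the delta small, so the final copy stays short.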

Challenge 4: Estimating Switchover Time

With traditional tools, this is a manual process: you review the logs from each incremental run to estimate how long the final incremental copy will take.
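That manual estimate boils down to a small calculation: take recent incremental durations, pad the worst case with a safety factor, and compare against the cutover window. The run times and the 1.5x factor below are assumptions for illustration:

```python
# Estimate whether the final copy fits the cutover window, based on the
# durations of recent incremental runs (all figures are illustrative).

def fits_window(incremental_minutes, window_minutes, safety_factor=1.5):
    """Assume the final pass resembles the worst recent incremental, padded."""
    estimate = max(incremental_minutes) * safety_factor
    return estimate <= window_minutes, estimate

recent_runs = [42, 38, 55, 40]              # minutes per nightly incremental
ok, estimate = fits_window(recent_runs, window_minutes=120)
print(ok, estimate)                         # → True 82.5 (fits a 2-hour window)
```

A dry run, as described below for DobiMiner, replaces this extrapolation with a measured simulation of the final copy.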

DobiMiner allows you to create migration windows based on the cutover time and determines whether the final migration will fit within the specified window. A dry run can also be performed to simulate the final copy.

Challenge 5: Creation of Migration Jobs

With traditional tools, job creation can be scripted, but the scripts must be imported and scheduled on each server performing the jobs. On Windows, that means creating a batch file and then scheduling it with the Windows Task Scheduler.

DobiMiner can perform a bulk import of not only the file servers but also the migration pairs from a template. Once the template is created, an import function creates the migration jobs within the DobiMiner software itself.

Challenge 6: Incremental Copies

All the tools mentioned have a way to perform incremental copies, but some must perform a full file compare and can take hours to find the one file that needs copying. Others, such as the NDMP-based utilities, will not synchronize the directories at all.
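To see why a per-file compare can take hours, here is a naive sketch of that approach: walk both trees and flag any file whose size or modification time differs. This illustrates the general technique, not any specific tool's algorithm:

```python
# Naive incremental-copy candidate detection: compare each source file
# against the target by size and modification time. On a share with millions
# of files, these per-file stat calls are what make the compare slow.

import os

def changed_files(src_root, dst_root):
    changed = []
    for dirpath, _, filenames in os.walk(src_root):
        for name in filenames:
            src = os.path.join(dirpath, name)
            rel = os.path.relpath(src, src_root)
            dst = os.path.join(dst_root, rel)
            if not os.path.exists(dst):
                changed.append(rel)          # new file, needs copying
                continue
            s, d = os.stat(src), os.stat(dst)
            if s.st_size != d.st_size or int(s.st_mtime) > int(d.st_mtime):
                changed.append(rel)          # modified since last sync
    return changed
```

Scanning both sides on a schedule and storing the results, as the article describes DobiMiner doing, avoids repeating this full walk at copy time.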

DobiMiner scans directories faster than the other tools mentioned; the scans run on a schedule and the results are stored within DobiMiner. Because it both scans and copies incrementally faster, it shortens the overall migration and saves you precious time.

Challenge 7: Mix of SMB and NFS Clients

If a customer wants to copy NFS files as well as SMB files, then two tools will be required, such as RoboCopy and Rsync.

DobiMiner migrates both NFS and SMB files from a single pane of glass, supporting SMB, NFS, and even mixed-mode protocols.

Conclusion:

It's no surprise that DobiMiner has been named the tool of choice for file-based migrations on EMC Unity and other platforms. In my experience, DobiMiner has been a welcome addition to the file migration process and continues to impress me. It addresses every challenge presented in this article and offers technological advantages over legacy tools.

 

Tegile: Economical Way To Accelerate Enterprise Applications!

Gone are the days when companies relied on conventional storage to handle their workloads. That approach has several drawbacks: a given array tends to suit only one specific type of workload, and fast performance comes only at a steep price. Tegile is a company from a new breed of innovators, with products that reduce enterprises' dependency on heavy, expensive systems. Its hybrid and all-flash products can handle virtually any type of workload.

Organizations from small businesses to well-established enterprises use its systems. The company designed its products, both all-flash and hybrid, with users' demands in mind. All of its solutions are built on two architectures: Intelligent Flash Arrays (hybrid and all-flash) and IntelliStack converged infrastructure solutions.

IntelliStack integrates the computing power of Cisco UCS with Tegile storage, yielding configurations that are certified, pre-validated, and pre-sized. If you are looking for a system that works best for latency-sensitive, business-critical applications, opt for an all-flash array. These arrays deliver high performance at an economical price while cutting operational costs, making them well suited to online transaction processing, data warehousing, decision support, and real-time analytics. The all-flash line has four models: T3500, T3600, T3700, and T3800.

If you are looking for a system that combines hard disk density with flash performance, the hybrid arrays are capable of accelerating business applications. They are available in eight models: T3100, T3200, T3300, T3530, T3630, T3730, T3750, and T3850.

All the models are based on the IntelliFlash architecture, which manages various types of storage media (high-density flash, high-performance flash, and hard disk) intelligently within one flexible operating environment to deliver the best performance at an economical cost. The platform offers multi-protocol support, advanced data services, and several other features that let you shrink your storage footprint, consolidate workloads, and maximize uptime.

Another strong point of this system is its support and analytics. IntelliCare will grab your attention and may well convince you to make these arrays your solution for enterprise needs. It is a support platform backed by the company's own support team and driven by cloud analytics.

Overall, this new breed of innovators has made its mark on the IT market in a very short time with products known for high performance at an economical cost. Consolidation and virtualization are no longer reserved for the big players; small businesses also enjoy uninterrupted service from Tegile. You can also expect a high level of data integrity from these arrays, because they use enterprise MLC SSDs, which strengthens the products in terms of quality and security.

If you want to ease your business workload while boosting performance, start planning for the Tegile product whose rich feature set best serves your purpose.

 

Cloud Computing!

Cloud computing has revolutionized the way technology is used to share information and resources to achieve coherence, relevance, and economies of scale. These three factors are hugely important today, when individuals and businesses need to stay at the forefront of their activities, growing profits and revenues while reining in expenditure.

Cloud computing is a model of internet-based computing that provides on-demand processing capabilities, as well as data, to computers and other devices on a network through a shared pool of resources, such as applications and services, networks, servers, and storage devices, all of which can be requested and used with minimal effort. It gives businesses and users the ability to store and process vital data in third-party data centers.

In simple terms, cloud computing means storing and accessing information and applications over the internet instead of keeping them on local hard drives or in-house servers. The information accessed is not 'physically close', and the metaphor 'cloud' harks back to the days of flowcharts, graphs, and presentations where server infrastructure was depicted as a puffy, white cumulus cloud that stores and doles out information.

Cloud computing, or 'the cloud' as it is commonly known, enables a 'pay as you go' model. The availability of low-cost computers and devices, high-capacity networks, and storage devices, along with complementing factors such as service-oriented architecture and the adoption of hardware virtualization and utility computing, has contributed to the success of cloud computing in a very big way.

Cloud Computing Architecture

The five essential characteristics that define cloud computing, as identified by NIST, are:

• Broad network access
• On-demand self-service
• Resource pooling
• Measured service
• Rapid elasticity or expansion

Broadly, that sums up the essence of this kind of computing. However, several loosely coupled components and sub-components are essential to make it work. These are divided into two sections, the front end and the back end, which connect to each other via the Internet.

The Front End comprises the visible interfaces that clients encounter when using their web-enabled devices. Not all cloud computing systems use the same interfaces.

The Back End comprises all the resources that deliver cloud computing services. These are essentially virtual machines, data storage facilities, security mechanisms etc. that together provide a deployment model and are responsible for providing the ‘cloud’ part of the computing service.

Benefits

Proponents of cloud computing are quick to praise it, citing the many advantages and benefits it provides. Among them, the prime ones are:

• Enables scale up and scale down of computing needs
• Enables businesses to avoid infrastructure costs
• Allows companies to get applications up and running faster
• Improves manageability and adjustability of IT resources to meet fluctuating business demands
• Reduces maintenance

The high demand for cloud computing is further driven by the advantages of low service costs, high computing power, greater performance and scalability, and easier accessibility and availability.