Wednesday, July 25, 2012

How strategists lead

I ran across this article in a McKinsey report and thought I would share. Not that Harvard professors have all the answers, but it is an interesting perspective from someone who has been exposed to a lot of very successful business people.  

I can honestly say I am working in an organization that fully understands this...    


How strategists lead


A Harvard Business School professor reflects on what she has learned from senior executives about the unique value that strategic leaders can bring to their companies.





Seven years ago, I changed the focus of my strategy teaching at the Harvard Business School. After instructing MBAs for most of the previous quarter-century, I began teaching the accomplished executives and entrepreneurs who participate in Harvard’s flagship programs for business owners and leaders.
Shifting the center of my teaching to executive education changed the way I teach and write about strategy. I’ve been struck by how often executives, even experienced ones, get tripped up: they become so interested in the potential of new ventures, for example, that they underestimate harsh competitive realities or overlook how interrelated strategy and execution are. I’ve also learned, in conversations between class sessions (as well as in my work as a board director and corporate adviser), about the limits of analysis, the importance of being ready to reinvent a business, and the ongoing responsibility of leading strategy.
All of this learning speaks to the role of the strategist—as a meaning maker for companies, as a voice of reason, and as an operator. The richness of these roles, and their deep interconnections, underscore the fact that strategy is much more than a detached analytical exercise. Analysis has merit, to be sure, but it will never make strategy the vibrant core that animates everything a company is and does.
The strategist as meaning maker
I’ve taken to asking executives to list three words that come to mind when they hear the word strategy. Collectively, they have produced 109 words, frequently giving top billing to plan, direction, and competitive advantage. In more than 2,000 responses, only 2 had anything to do with people: one said leadership, another visionary. No one has ever mentioned strategist.
Downplaying the link between a leader and a strategy, or failing to recognize it at all, is a dangerous oversight that I tried to start remedying in a Harvard Business Review article four years ago and in my new book, The Strategist, whose thinking this article extends.1 After all, defining what an organization will be, and why and to whom that will matter, is at the heart of a leader’s role. Those who hope to sustain a strategic perspective must be ready to confront this basic challenge. It is perhaps easiest to see in single-business companies serving well-defined markets and building business models suited to particular competitive contexts. I know from experience, though, that the challenge is equally relevant at the top of diversified multinationals.
What is it, after all, that makes the whole of a company greater than the sum of its parts—and how do its systems and processes add value to the businesses within the fold? Nobel laureate Ronald Coase posed the problem this way: “The question which arises is whether it is possible to study the forces which determine the size of the firm. Why does the entrepreneur not organize one less transaction or one more?”2 These are largely the same questions: are the extra layers what justifies the existence of this complex firm? If so, why can’t the market take care of such transactions on its own? If there’s more to a company’s story, what is it, really?
In the last three decades, as strategy has moved to become a science, we have allowed these fundamental questions to slip away. We need to bring them back. It is the leader—the strategist as meaning maker—who must make the vital choices that determine a company’s very identity, who says, “This is our purpose, not that. This is who we will be. This is why our customers and clients will prefer a world with us rather than without us.” Others, inside and outside a company, will contribute in meaningful ways, but in the end it is the leader who bears responsibility for the choices that are made and indeed for the fact that choices are made at all.
The strategist as voice of reason
Bold, visionary leaders who have the confidence to take their companies in exciting new directions are widely admired—and confidence is a key part of strategy and leadership. But confidence can balloon into overconfidence, which seems to come naturally to many successful entrepreneurs and senior managers who see themselves as action-oriented problem solvers.3
I see overconfidence in senior executives in class when I ask them to weigh the pros and cons of entering the furniture-manufacturing business. Over the years, a number of highly regarded, well-run companies—including Beatrice Foods, Burlington Industries, Champion, Consolidated Foods, General Housewares, Gulf + Western, Intermark, Ludlow, Masco, Mead, and Scott Paper—have tried to find fortune in the business, which traditionally has been characterized by high transportation costs, low productivity, eroding prices, slow growth, and low returns. It’s also been highly fragmented. In the mid-1980s, for example, more than 2,500 manufacturers competed, with 80 percent of sales coming from the biggest 400 of them. Substitutes abound, and there is a lot of competition for the customer’s dollar. Competitors quickly knock off innovations and new designs, and the industry is riddled with inefficiencies, extreme product variety, and long lead times that frustrate customers. Consumer research shows that many adults can’t name a single furniture brand. The industry does little advertising.
By at least a two-to-one margin, the senior executives in my classes typically are energized, not intimidated, by these challenges. Most argue, in effect, that where there’s challenge there’s opportunity. If it were an easy business, they say, someone else would already have seized the opportunity; this is a chance to bring money, sophistication, and discipline to a fragmented, unsophisticated, and chaotic industry. As the list above shows, my students are far from alone: with great expectations and high hopes of success, a number of well-managed companies over the years have jumped in with the intention of reshaping the industry through the infusion of professional management.
All those companies, though, have since left the business—providing an important reminder that the competitive forces at work in your industry determine some (and perhaps much) of your company’s performance. These competitive forces are beyond the control of most individual companies and their managers. They’re what you inherit, a reality you have to deal with. It’s not that a company can never change them, but in most cases that’s very difficult to do. The strategist must understand such forces, how they affect the playing field where competition takes place, and the likelihood that his or her plan has what it takes to flourish in those circumstances. Crucial, of course, is having a difference that matters in the industry. In furniture— an industry ruled more by fashion than function—it’s extremely challenging to uncover an advantage strong enough to counter the gravitational pull of the industry’s unattractive competitive forces. IKEA did it, but not by disregarding industry forces; rather, the company created a new niche for itself and brought a new economic model to the furniture industry.
A leader must serve as a voice of reason when a bold strategy to reshape an industry’s forces actually reflects indifference to them. Time and again, I’ve seen division heads, group heads, and even chief executives dutifully acknowledge competitive forces, make a few high-level comments, and then quickly move on to lay out their plans—without ever squarely confronting the implications of the forces they’ve just noted. Strategic planning has become more of a “check the box” exercise than a brutally frank and open confrontation of the facts.
The strategist as operator
A great strategy, in short, is not a dream or a lofty idea, but rather the bridge between the economics of a market, the ideas at the core of a business, and action. To be sound, that bridge must rest on a foundation of clarity and realism, and it also needs a real operating sensibility. Every year, early in the term, someone in class always wants to engage the group in a discussion about what’s more important: strategy or execution. In my view, this is a false dichotomy and a wrongheaded debate that the students themselves have to resolve, and I let them have a go at it.
I always bring that discussion up again at the end of the course, when we talk about Domenico De Sole’s tenure at Italian fashion eminence Gucci Group.4 De Sole, a tax attorney, was tapped for the company’s top job in 1995, following years of plummeting sales and mounting losses in the aftermath of unbridled licensing that had plastered Gucci’s name and distinctive red-and-green logo on everything from sneakers to packs of playing cards to whiskey—in fact, on 22,000 different products—making Gucci a “cheapened and over-exposed brand.”
De Sole started by summoning every Gucci manager worldwide to a meeting in Florence. Instead of telling managers what he thought Gucci should be, De Sole asked them to look closely at the business and tell him what was selling and what wasn’t. He wanted to tackle the question “not by philosophy, but by data”—bringing strategy in line with experience rather than relying on intuition. The data were eye opening. Some of Gucci’s greatest recent successes had come from its few trendier, seasonal fashion items, and the traditional customer—the woman who cherished style, not fashion, and who wanted a classic item she would buy once and keep for a lifetime—had not come back to Gucci.
De Sole and his team, especially lead designer Tom Ford, weighed the evidence and concluded that they would follow the data and position the company in the upper middle of the designer market: luxury aimed at the masses. To complement its leather goods, Ford designed original, trendy—and, above all, exciting—ready-to-wear clothing each year, not as the company’s mainstay, but as its draw. The increased focus on fashion would help the world forget all those counterfeit bags and the Gucci toilet paper. It would propel the company toward a new brand identity, generating the kind of excitement that would bring new customers into Gucci stores, where they would also buy high-margin handbags and accessories. To support the new fashion and brand strategies, De Sole and his team doubled advertising spending, modernized stores, and upgraded customer support. Unseen but no less important to the strategy’s success was Gucci’s supply chain. De Sole personally drove the back roads of Tuscany to pick the best 25 suppliers, and the company provided them with financial and technical support while simultaneously boosting the efficiency of its logistics. Costs fell and flexibility rose.
In effect, everything De Sole and Ford did—in design, product lineup, pricing, marketing, distribution, manufacturing, and logistics, not to mention organizational culture and management—was tightly coordinated, internally consistent, and interlocking. This was a system of resources and activities that worked together and reinforced each other, all aimed at producing products that were fashion forward, high quality, and good value.
It is easy to see the beauty of such a system of value creation once it’s constructed, but constructing it isn’t often an easy or a beautiful process. The decisions embedded in such systems are often gutsy choices. For every moving part in the Gucci universe, De Sole faced a strictly binary decision: either it advanced the cause of fashion-forwardness, high quality, and good value—or it did not and was rebuilt. Strategists call such choices identity-conferring commitments. They are central to what an organization is or wants to be and reflect what it stands for.
When I ask executives at the end of this class, “Where does strategy end and execution begin?” there isn’t a clear answer—and that’s as it should be. What could be more desirable than a well-conceived strategy that flows without a ripple into execution? Yet I know from working with thousands of organizations just how rare it is to find a carefully honed system that really delivers. You and every leader of a company must ask yourself whether you have one—and if you don’t, take the responsibility to build it. The only way a company will deliver on its promises, in short, is if its strategists can think like operators.
A never-ending task
Achieving and maintaining strategic momentum is a challenge that confronts an organization and its leader every day of their entwined existence. It’s a challenge that involves multiple choices over time—and, on occasion, one or two big choices. Very rare is the leader who will not, at some point in his or her career, have to overhaul a company’s strategy in perhaps dramatic ways. Sometimes, facing that inevitability brings moments of epiphany: “eureka” flashes of insight that ignite dazzling new ways of thinking about an enterprise, its purpose, its potential. I have witnessed some of these moments as managers reconceptualized what their organizations do and are capable of doing. These episodes are inspiring—and can become catalytic.
At other times, facing an overhaul can be wrenching, particularly if a company has a set of complex businesses that need to be taken apart or a purpose that has run its course. More than one CEO—men and women coming to grips with what their organizations are and what they want them to become—has described this challenge as an intense personal struggle, often the toughest thing they’ve done.
Yet those same people often say that the experience was one of the most rewarding of their whole lives. It can be profoundly liberating as a kind of corporate rebirth or creation. One CEO described his own experience: “I love our business, our people, the challenges, the fact that other people get deep benefits from what we sell,” he said. “Even so, in the coming years I can see that we will need to go in a new direction, and that will mean selling off parts of the business. The market has gotten too competitive, and we don’t make the margins we used to.” He winced as he admitted this. Then he lowered his voice and added something surprising. “At a fundamental level, though, it’s changes like this that keep us fresh and keep me going. While it can be painful when it happens, in the long run I wouldn’t want to lead a company that didn’t reinvent itself.”

About the Author
Cynthia Montgomery is the Timken Professor of Business Administration at Harvard Business School, where she’s been on the faculty for 20 years, and past chair of the school’s Strategy Unit.

Elements of this article were adapted from Cynthia Montgomery’s The Strategist: Be the Leader Your Business Needs (New York, NY: HarperCollins, 2012).

Thursday, March 22, 2012

Capital Planning and Investment Control - What's Next?


It has been over 10 years since the Clinger-Cohen Act established Capital Planning and Investment Control (CPIC) as a new way to manage IT programs.  I am not a CPIC expert, so I give much credit to those who have been working to improve how government agencies monitor the cost, schedule, and performance of IT projects.  It has been over 10 years... so are things continuing to improve, or is the community at a crossroads?

What is the next big step for the CPIC community? 

Having attended the last two CPIC conferences in the DC area, with the most recent one just this week... it appears that a challenge has been issued for the community to step out of a data-gathering and compliance mode and enter a new level of relevance, importance, and value to your organizations and to the ultimate customer... the U.S. citizen. 

Repeated calls were heard to step out of a purely compliance focus and step into the executive boardroom, making your input, analysis, and recommendations relevant and heard.

Again, I am not claiming to be an expert in this community; however, I do know a thing or two about organizational processes and change management... and I can appreciate the difficulty that this challenge presents to those who have limited resources and organizational pressure to meet data collection and input timelines, especially for "pet" projects.  However, my instinct tells me that this challenge can be met.

It is easy to slip into a mode that is entirely focused on meeting or complying with reporting requirements... and I know there are often roadblocks that are hard to maneuver around. So there is a personal risk that may come into play when you step out of your box and attempt to provide some meaningful analysis and recommendations developed from your years of experience and the data you have gathered.

Of course my fallback is always metrics... and the metrics that give you the ability to take this step are more than you may be required to capture now.  But I contend that a higher level of integrated metrics is what provides the true value to what the CPIC community is trying to do. 

How do you integrate CPIC, enterprise architecture, program management, and strategic management or governance activities?  I don't pretend to have the answer to this question for everyone. However, I would suggest it requires a different look, a different view, an integrated view.  Let's explore how to look at these domains within the box, clearly seeing them for what they are, but elevating our analysis to an integrated view which is truly outside of the box... providing a new level of insight, business information, and strategic thinking that transcends the individual compliance-focused efforts locked up within the box.
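To make this a little more concrete, here is a rough sketch (in Python, with entirely hypothetical field names, scores, and weights) of what an integrated rollup might look like: cost and schedule data from CPIC reporting, an architecture-alignment score from the EA team, and a strategic-priority weight from the governance board, combined into a single ranked view of the portfolio. Treat it as an illustration of the idea, not a prescribed model.

# Hypothetical sketch: rolling CPIC, EA, and strategy data into one integrated view.
# Field names, scores, and weights are illustrative only, not a prescribed standard.
from dataclasses import dataclass

@dataclass
class Investment:
    name: str
    cost_variance_pct: float      # from CPIC reporting (planned vs. actual cost)
    schedule_variance_pct: float  # from CPIC reporting (planned vs. actual schedule)
    ea_alignment: float           # 0-1, from the enterprise architecture review
    strategic_weight: float       # 0-1, from the governance board's priorities

def integrated_score(inv: Investment) -> float:
    """Blend execution health (CPIC), architecture fit (EA), and strategic priority."""
    execution_health = 1.0 - (abs(inv.cost_variance_pct) + abs(inv.schedule_variance_pct)) / 200.0
    return round(0.4 * max(execution_health, 0.0)
                 + 0.3 * inv.ea_alignment
                 + 0.3 * inv.strategic_weight, 3)

portfolio = [
    Investment("Case Management Modernization", 12.0, 20.0, 0.8, 0.9),
    Investment("Legacy Mainframe Refresh", 5.0, 3.0, 0.3, 0.4),
]

for inv in sorted(portfolio, key=integrated_score, reverse=True):
    print(f"{inv.name}: integrated score {integrated_score(inv)}")

The point is not the particular weights; it is that a single view forces the compliance data, the architecture view, and the strategic priorities to be discussed together.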

Some agencies are starting to see the value of this level of analysis as budgets get tighter and tough decisions must be made. What is needed, to promote this concept and leverage the expertise across the industry, is a forum and framework to facilitate and drive this effort. I don't think the right one exists at this point.  And this is worth some thought as we continue to move forward with an increasing integration of business information needs and new technology capabilities. 

Friday, July 22, 2011

Information Governance with SharePoint

I recently attended a seminar on Information Governance and until then I had not really considered what "Information Governance" was all about, much less that it was anything significant.  I now think there is something to this concept.

Having worked on several projects that focused primarily on records or case folder management, and having had my eyes opened to the real problems that most government agencies have with how they manage their records, I have a growing understanding of and respect for the records management profession.  I have heard from more than one records management professional that "records management is not rocket science." However, there are organizational issues that make it hard to implement a sound and compliant records management program. The main problem is that no one seems to care enough about records management. Unfortunately, little attention is paid to this area: everyone shares the responsibility to perform records management, but few realize the importance of doing so. And, for those of us who have been around long enough to remember the office admin assistant (secretary), many of them are not there anymore to keep us straight and our records filed correctly. Some organizations don't think seriously about records management until they are being sued for something they did or didn't do, and then they can't find the documents to support what was done.  Too Late!!!

The seminar I attended focused on records management capabilities of an application that works with SharePoint 2010, but the epiphany that I experienced had more to do with the emerging importance of Information Governance.

As it was explained, information governance is the convergence of Records Management, Search or eDiscovery, and Storage Management. As these functions overlap, the importance of reducing records management resources, easing access to information, and saving storage costs creates a focus on the information within that framework and on the ability to govern it.

So what does this have to do with SharePoint?  The popularity of SharePoint across the Government is phenomenal: roughly three out of four agencies are using SharePoint to some degree. And as many of you have probably heard, there are nearly as many horror stories about what went wrong with the implementation.  One of the benefits of SharePoint is the tremendous capability it gives users to create websites and to share and store documents like never before. However, therein lies the problem for many agencies. Careful thought and planning must occur prior to implementation, and a vigorous governance framework must lead the way. Otherwise, existing information governance issues become exacerbated and spiral out of control.

Back to records management for a moment. A principle frequently espoused by records management professionals is that good records management is all about the content, not the container. This emphasizes the importance of the information within the record rather than the container that holds it. With this in mind, think of what is going on with electronic information in particular. The amount of information is growing exponentially, and the ways we access and store it are expanding, from social media to cloud computing. This ever-increasing amount of information from various sources creates a demand for agency-wide collaboration tools like SharePoint.  So how do you address the implementation of an agency-wide collaboration tool like SharePoint?

Common sense tells me that first you must find out what you don't know. Get a basic understanding of what information you have and the various types of containers the information is in.  This may not be a simple task. Any agency that has been around for more than a couple of years will have an amazing amount of information, and it may be in dozens of formats. And, as is typical for many organizations, much of this information is not known to the agency as a whole but only to a few people within a business unit. In my opinion there are two primary reasons for this lack of agency knowledge.

First, records management has not been a priority for the organization. Records inventories are not conducted, and file plans either do not exist or are not followed. This leads to records and information that are not tracked or managed, creating a "you don't know what you don't know" situation.

Second, the agency CIO either is not fully empowered or does not have the resources to manage the "I" in CIO... that is, Information. Most likely, the main focus has been on systems and the infrastructure supporting the business. Knowing the organization's information, what is important, who creates and owns it, and so on, has not been as strong a focus as managing all the technology equipment.  Again, this creates two problems. One, the information is not known, managed, or reliable. Two, shadow IT flourishes, because the systems are not managing and supporting the information in a way that suits the users. Faced with the lack of system support, business units find a way to develop their own "work-arounds," or in some agencies they are funded to develop their own systems. And the cycle continues to grow and spiral out of control.

Information governance combined with a powerful collaboration tool like SharePoint can help manage your information. Without governance, these collaboration tools can be the catalyst that fertilizes existing problems. Information governance can drive and control some basics for your implementation planning, helping the agency answer questions such as: How much and what type of content do you have? Who will be using the content, and what compliance requirements or regulations must you consider?
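As a small illustration of the first of those questions, how much and what type of content you have, here is a minimal sketch in Python. The share path is hypothetical and it assumes read access to the drive; it simply walks a directory tree and tallies content by file type and size. A rough baseline like this can feed a records inventory or a SharePoint migration plan.

# Minimal sketch of a content inventory: walk a shared drive and tally files
# by extension and total size. The path below is hypothetical.
import os
from collections import defaultdict

def inventory(root: str) -> dict:
    """Return {extension: [file_count, total_bytes]} for everything under root."""
    totals = defaultdict(lambda: [0, 0])
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            ext = os.path.splitext(name)[1].lower() or "(no extension)"
            try:
                size = os.path.getsize(os.path.join(dirpath, name))
            except OSError:
                continue  # skip files we cannot stat (permissions, broken links)
            totals[ext][0] += 1
            totals[ext][1] += size
    return totals

if __name__ == "__main__":
    for ext, (count, size) in sorted(inventory(r"\\agency-share\records").items(),
                                     key=lambda kv: kv[1][1], reverse=True):
        print(f"{ext:15} {count:8} files  {size / 1_048_576:10.1f} MB")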



I would like to hear your thoughts on "Information Governance"

      

Tuesday, May 3, 2011

Lean Six Sigma and the Evolution of Software Development

Software development processes have evolved significantly in recent years, shifting from traditional software development lifecycles (SDLCs) to more flexible or “Agile” development approaches. Although many new development models exist, the iterative and incremental techniques they introduce all reflect a shift away from the “Waterfall” methodology that dominated the industry for years. Agile software development methodologies are based on an iterative and incremental development approach, where requirements and technical solutions evolve through extensive collaboration between business owners and cross-functional teams. So, regardless of the process model's name (e.g., Agile Unified Process, Spiral, Rational Unified Process, Rapid Application Development, Cleanroom development, and Scrum), the fundamental tenet of each is iterative and incremental development.


Although waterfall methodologies have received a lot of criticism, there are some instances where waterfall may be an effective development approach, such as development within stable environments, including maintenance releases of existing products or porting an existing product to a new platform. Absent these factors, however, an iterative development approach will more likely provide the flexibility and communication needed to enhance project success.

Current best practices in software development management stem from three key drivers. First, development processes are driven by short-term milestones. By focusing on short-term goals (usually about a month), teams gain a laser focus on requirements and an improved alignment of business and technology. Second, development processes give teams some flexibility in approach. Although the development process is well defined, teams have the flexibility to solve problems and create solutions, allowing for creative and innovative results. And third, communication and learning within the development process are used as a catalyst to spawn change and institute new work practices throughout the organization. As development teams work with business owners to create solutions, the communication and learning between business owners and technology professionals lead to innovative improvements that are shared across organizations.

Professional organizations for standards and frameworks like CMMI, PMBOK, Lean Six Sigma and ISO have introduced additional models and techniques that can be leveraged to improve agile development cycles. PMBOK, for example, has become the industry standard for defining management practices and processes. As a starting point for building a development team, trained and experienced project managers are a necessity with PMI certification validating an understanding of the industry best practices for project management.

The Capability Maturity Model Integration (CMMI) for Development is an integration of best practices that provides a single framework to assess development and maintenance processes and improve performance. CMMI certification has also become a standard for software development. Although many development teams/shops strive to obtain a CMMI certification, many of the tools can be used without going through the extensive certification process.

Lean development draws on lean and Six Sigma techniques, which focus on improving processes by eliminating waste. Within the context of software development, these process wastes are typically: 1.) unnecessary coding and functionality development, 2.) delays in the development cycle, 3.) unclear requirements, 4.) bureaucracy, and 5.) slow communication. By recognizing these “waste” areas and eliminating or reducing them, performance improves dramatically, adding value to your solutions and to business operations. Incorporating lean principles and conducting a value assessment to remove non-value-added steps and processes helps keep the development cycle lean and high performing.
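As a rough illustration of that value assessment, the sketch below (Python; the step names and durations are made up) computes process cycle efficiency, that is, value-added time divided by total lead time, for a hypothetical development cycle. Numbers like these make the waste categories above visible rather than anecdotal.

# Hypothetical value assessment for a development cycle: each step has a
# duration (days) and a flag for whether it adds value the customer pays for.
# Step names and durations are illustrative only.
steps = [
    ("Requirements workshop",        3, True),
    ("Waiting for approval",         7, False),   # delay: pure lead time
    ("Design and coding",           15, True),
    ("Rework from unclear specs",    5, False),   # defect/rework waste
    ("Status reporting overhead",    2, False),   # bureaucracy
    ("Testing and acceptance",       6, True),
]

total_lead_time = sum(days for _, days, _ in steps)
value_added_time = sum(days for _, days, adds_value in steps if adds_value)

print(f"Total lead time:          {total_lead_time} days")
print(f"Value-added time:         {value_added_time} days")
print(f"Process cycle efficiency: {value_added_time / total_lead_time:.0%}")
for name, days, adds_value in steps:
    if not adds_value:
        print(f"  candidate waste -> {name} ({days} days)")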

So Lean Six Sigma techniques are well aligned with recent trends in software development. Lean Six Sigma provides a new way of looking at your development processes, with a strong focus on how to improve the process, which by default leads to faster, better, and often cheaper solutions. 

Monday, March 28, 2011

Giving Technology the Business!

Giving someone the business is a phrase you have probably heard. It means giving someone trouble, a roughing up, or a beating.  Giving someone the business can also mean you are trying to make a point and are willing to go to a lot of trouble to make it. So what's my point? What do I mean about giving technology the business?  First, I am not condoning beating up your techie friends. What I am suggesting is taking a look at your business without a technology focus for a change. Instead of focusing on how technology is used to support your business, let's take a look at the business to see how the process is working, not how your technology is working.

Are you experiencing bottlenecks, backlogs, or long wait times between your process steps? Once you identify exactly where the problem areas are occurring, you can begin to focus on why they are occurring. After you have identified the problem areas and why they occur, you can begin to ask whether these problems are due to poorly defined business process requirements or poorly designed technology systems. 

A common Six Sigma tool for assessing processes is the Value Stream Map. In a manufacturing production line, the Value Stream Map is often used to identify backlogs, delays, and overproduction, which cost money and create waste. It is quick and easy to see where production delays are occurring and to focus on specific ways to improve your production time. But how does a Value Stream Map help with a business process where backlogs, delays, and overproduction are not so obvious? It can be done, but we don't usually look at business processes this way. The following link provides some examples of Value Stream Maps from eVSM software: http://www.evsm.com/examples.htm

There are a couple of techniques in developing a Value Stream Map that help us look at a business process in a new way. One technique is mapping the process from the end point backward. Most of you process modeling purists out there may question this approach, but give it a chance. Start by looking at the end product, service, or output. What is the end point of the process? From that point, start working backward by looking at the inputs to the last step. By starting at the end and moving backward to the beginning of the process, you can more easily determine which process steps were waiting for input. Maybe the process was not only waiting for input but routinely receives the wrong input, or too much input. As you work your way backward through the process, document these areas where time is lost.
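Here is a very simple sketch of that backward walk in Python. The step names, work times, and wait times are made up; the point is only to show how stepping backward from the end product makes the waiting between steps stand out.

# Sketch of walking a process backward from its end point. Steps are listed in
# normal (forward) order; names, work times, and wait times are hypothetical.
steps = [
    # (step name, hands-on work time in hours, wait time before this step in hours)
    ("Receive application",     1,  0),
    ("Initial review",          2, 24),
    ("Request missing info",    1, 72),
    ("Supervisor approval",     1, 48),
    ("Issue decision letter",   1,  8),
]

print("Walking the process from the end point backward:\n")
for name, work_hours, wait_hours in reversed(steps):
    flag = "  <-- waiting dominates" if wait_hours > 4 * work_hours else ""
    print(f"{name:25} work {work_hours:3}h   waits on input {wait_hours:3}h{flag}")

total_work = sum(w for _, w, _ in steps)
total_wait = sum(q for _, _, q in steps)
print(f"\nTotal hands-on time: {total_work}h; total waiting: {total_wait}h "
      f"({total_wait / (total_work + total_wait):.0%} of elapsed time)")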

From a business perspective these are the delays, backlogs and production errors that are much more visible in a manufacturing environment, but routinely exist in many business processes. Sometimes you may find that a technology system or application forces an action that is not adding value to the process or may be causing a delay or unnecessary step. When system requirements get out of sync with business processes, sometimes the business is expected to adapt to the system despite the inefficiencies created. Reducing this impact is what I refer to as a high integration of business and technology.  High integration of business and technology minimizes the negative effects of technology on business processes.

Can or should the business process change to leverage the benefits of a new technology?  This hits at the root of most problems when implementing new technology. My opinion is that most of the problems with implementing new technology are due to business problems, not technology limitations. More often than not, technology capabilities can support business requirements.  However, we often muddle things up by confusing and co-mingling discussions of business requirements and technology capabilities, losing focus on the business need and conceding to change the business to conform to existing technology limitations. Sometimes this may not be avoidable, but let's not default to this way of thinking. Let's keep giving technology the business.  

I am not sure how many organizations use a Value Stream Map or similar tool to develop a fresh perspective on business processes. I think there are not many, and I suggest there should be many more. At worst it provides a clearer picture of your processes, at best it highlights the specific areas where you can reduce wasted time or process steps.  And maybe it will help define areas where your technology can better serve your business process.   


 
 

Wednesday, March 23, 2011

Six Sigma to the Rescue?

I just completed six months with Villanova's online Six Sigma Certificate Program, specifically the Lean Six Sigma Black Belt Certificate program. I must say that it is a great set of courses that requires more than a casual effort to complete. The combination of online virtual classroom lectures, CDs with hours and hours of presentations from industry leaders, and the required project work provides a great learning opportunity for those of us who are interested in that sort of thing.

This of course raises the question: why? What does this have to do with business and technology integration? Six Sigma has been around for several decades and has primarily been focused on the manufacturing industry, which limits its application for most of us who have never worked on a production line. However, the "Lean" focus is really about expanding the use of Six Sigma tools beyond the realm of manufacturing to the business service environment. So my plan over the next few months is to explore this idea of using Six Sigma to solve business problems, which includes technology problems, since technology is a service to the business.

I welcome input on areas to explore, but I believe my recent experiences will provide more than enough fodder for comment over the next few months.

For now, I will leave you with a few basic points about Lean Six Sigma. First, the primary focus is on reducing waste in your existing processes. Waste can be recognized in many shapes and forms. Basically, if something is happening in your process that is not adding value to the service you are providing, then there is potential waste... and therefore there is potential to eliminate or drastically reduce it, and by default improve business processes and the bottom line.

Second, what do I mean by adding value? Very simple... to add value to your service you must physically change something in a way that the customer is willing to pay for, and it must be done right the first time; no rework allowed. That is the only way to add value.
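As a small illustration, here is a minimal sketch in Python of that three-question test applied to a few hypothetical steps. The steps and the answers are made up; in practice the answers come from talking to the customer and to the people doing the work.

# Minimal sketch of the three-question value-add test described above.
# Step descriptions and answers are hypothetical.
def adds_value(changes_the_thing: bool, customer_will_pay: bool, right_first_time: bool) -> bool:
    """A step adds value only if all three answers are yes."""
    return changes_the_thing and customer_will_pay and right_first_time

candidate_steps = {
    "Process the customer's order":           (True,  True,  True),
    "Re-key the order into a second system":  (True,  False, True),
    "Fix errors from the first data entry":   (True,  True,  False),
}

for step, answers in candidate_steps.items():
    verdict = "value-added" if adds_value(*answers) else "potential waste"
    print(f"{step}: {verdict}")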

So I will leave on that note for now. More to come soon.

Wednesday, January 20, 2010

Services To The Rescue: A Look Inside The Services Available For Today’s Data Centers

Processor Magazine
June 27, 2008 • Vol. 30, Issue 26 • Page(s) 9 in print issue

There exists a dilemma of mounting proportions in data centers at many small to midsized enterprises. As companies realize the benefits of blending business and IT objectives to further the overall success of the business itself, IT managers are being perpetually pulled away from data center tasks that they shouldn’t ignore. Budget restraints at many companies prevent the hiring of additional personnel to solve this problem, but service providers can offer an effective alternative. The following services can not only remove data center responsibilities from the shoulders of IT managers, but they can even introduce expertise and value into the overall data center equation.

Data Services

The IT staffs in many SMEs can capably handle a wide range of technological demands, but as businesses grow, the demands can put a hefty strain on existing personnel. This strain becomes particularly evident around data-related tasks such as electronic data storage, data backup, records management, document imaging, and document destruction. “Data-related service providers offer an effective alternative,” says Nicholas Kottyan, CEO of DataChambers (http://www.datachambers.com/). “For example, outsourcing routine data backups can ensure that business-critical information is protected on a regular basis, even if a team member is sick, on vacation, or decides to leave the company. You can free up valuable staff time for more important assignments while maintaining continuity in your operations.” Using outside data services can also provide integral support to a company’s growth strategy, Kottyan says, because they provide it with a flexible method for managing peaks and valleys in the demand for IT support until managers can establish a firm trend line and determine whether it makes good business sense to add more employees. “Many SMBs outsource a portion of their operations in order to complement their own internal IT team. You might need extra arms and legs, 24/7 coverage, or a way to redeploy your internal resources to more critical tasks. You might want to store backups at a remote site, establish hot desks for disaster recovery, devote dedicated resources to hotline support, or find a more cost-effective way to maintain desktop computers,” Kottyan says. “Service providers can tailor support plans to meet your precise requirements and do so for a predictable monthly fee.”

Power Services

As the focus on green IT continues to grow, data center managers are placing a heavier emphasis on ensuring their power systems are correctly configured. For SMEs, this can entail tasks as simple as power testing to more advanced undertakings, such as CFD (computational fluid dynamics) modeling and full data center power assessments. This is weighty territory that’s often best left to power experts who can provide these services on a one-time or even regular basis. “Regular power testing helps you identify potential problems before they impact the bottom line,” says Omar McKee, manager of service solutions for Emerson Network Power’s Liebert Service (http://www.liebert.com/). “Too often, data center managers aren’t aware of an issue with the power system until the power goes out, and that’s an expensive alarm system. . . . As equipment ages and the IT network evolves, new problems will present themselves. Continued testing—even when it seems unnecessary—enables you to be prepared when those problems arise.” Power services are available for data centers of any size and can be as simple as the installation and continued maintenance of a UPS or the maintenance, monitoring, and replacement of batteries. More advanced services provide availability assessment, which can entail safety compliance, electrical infrastructure examination, power quality testing, and fuse and breaker examination. Power service providers also often handle cooling assessments and air-conditioning maintenance.

Planning, Design & Quality Assurance

Although SMEs might have the technical personnel to run an IT organization, planning and designing data centers and other IT infrastructure brings a set of challenges that often require outside expertise. Increasingly, the data center is tied into the business, and according to Dennis Lasley, vice president of Accent Global System Architects (http://www.accentglobal-llc.com/), the business is an important and often overlooked stakeholder. “Often, in-house experts, although they may be fully capable, are tied to organizational norms that may preclude a ‘fresh’ review of the business requirements,” Lasley says. “Service providers specializing in closing the business-to-technology gap can usually provide an unbiased and unfiltered review of business operations and the supporting system infrastructure.” Planning and design services can help with large or small projects and address a wide range of issues, including site selection, feasibility studies, architectural design, raised floor design, availability and risk assessments, and others. Many companies also offer quality assurance services, which help to ensure new and existing designs are running at efficient, effective levels.

Cleaning, Maintenance & Repairs

Most of the hype surrounding data centers entails the technology that resides within them, but without proper physical support for that technology, a data center can encounter big problems in a hurry. SMEs that are already strapped for bodies to run the data centers can’t afford specialized personnel to clean, maintain, and repair the equipment and infrastructure. “General maintenance tasks can be performed internally, but tasks that are specific to a particular system, such as the power or HVAC systems, may be best left to personnel certified in maintaining those systems,” notes Kris Domich, principal consultant of data center and storage solutions at Dimension Data (http://www.dimensiondata.com/). “Some of the most innocent attempts to clean sensitive components can result in damage, injury, or a voided warranty.” Service providers not only have personnel specifically trained to carefully clean expensive, sensitive equipment and the areas around it, but they also have equipment built for the job, including antistatic tools and chemicals. Cleaning and maintenance services are available for all areas of the data center, including raised floors and ceilings. Repair services are also available for all types of data center equipment, but not all providers service all equipment types. Depending on the type and scope of the repair, companies may opt to solicit bids for certain repair jobs.

Security

Advances in technology can bring a bounty of benefits to businesses and their bottom lines, but they also inevitably introduce more security threats that can be difficult to monitor and prevent. Chris Richter, vice president and general manager of security services at SAVVIS (http://www.savvis.net/), explains that SMEs often find that properly managing their security involves making a significant investment in capital and personnel. “Among other security best practices, separation of duties requires separate job functions for the management of IT security and other IT tasks. For example, it is not considered a best practice for a database administrator to also be responsible for management of firewall rules,” Richter says. “Outsourcing security functions to a managed security services provider can help SMBs achieve separation-of-duty practices and growth without adding personal costs.” Managed security service providers can address all of an SME’s security needs or just tackle particular areas, such as firewall management, intrusion prevention and detection, log management, vulnerability assessment, wireless security, and others. Some providers also offer physical security services, which can be useful for companies looking to protect resources around the clock.

by Christian Perry