GameStop Case Study – Dynamic Capabilities

A global specialty retailer of video game products, consumer electronics and wireless services, GameStop traces its roots to Babbage’s, a small software retailer founded in Dallas, Texas. The metamorphosis from Babbage’s into GameStop began with a series of mergers. The first was with Software Etc., a deal that formed Babbage’s Etc. LLC. In 1999, Babbage’s Etc. was sold to Barnes & Noble. The next merger was with Funco, Inc., which Barnes & Noble acquired in June 2000. Following this acquisition, Babbage’s Etc. became a wholly owned subsidiary of Funco, which also owned and published the video game industry magazine Game Informer, in publication since 1991. For the curious-minded, Game Informer is now the third-largest magazine in circulation in the US!

At the end of 2000, Funco changed its name to GameStop, and in 2002 the company completed its initial public offering on the New York Stock Exchange. In late 2004, GameStop spun off from Barnes & Noble by buying back six million shares of stock from the bookseller. GameStop then acquired Electronics Boutique (EB Games) in 2005, and in January 2007 it purchased Rhino Video Games from Blockbuster.

GameStop has since sensed how rapidly digital content is transforming the industry and has taken some steps to innovate in this area to stay relevant to the consumer. Investment dollars have gone to both acquisitions and strategic partnerships in an attempt to diversify its revenue streams and begin building out digital competencies. These moves have had mixed results so far.

GameStop will need to transform its company and business model to stay relevant in the digital video game sector. We recommend GameStop practice what David Teece refers to as ‘corporate ambidexterity’: continuing to cultivate and broaden its successful retail business while simultaneously investing in emerging streaming technology. In many ways this approach mirrors Netflix’s successful transition from physical DVDs to digital streaming.

Industry Analysis

For idiosyncratic reasons, video game retailing has grown steadily over the last decade while book, music, movie, and rental retailing have all been decimated by digital disruption. The games industry is nonetheless transforming to digital, much as music, movies, and books did over the last decade, and analysts expect that shift to accelerate rapidly in the coming years.

The global video game market as a whole is forecast to grow 11% over the next four years, to $104BN by 2017, driven mostly by gains in tablet, mobile, and digital sales. GameStop primarily sells physical games for home consoles, a segment that is rapidly contracting in both unit sales and dollars. Physical game sales in North America are expected to fall from 88% of the market in 2013 to an even split with digital games by 2018.

History

GameStop is a specialty retailer of new and used video game systems, games, and peripherals with nearly 6,700 stores, mostly in North America. Last year the company reported over $9BN in sales and $377MM in net income. GameStop’s traditional competitors have been other retailers such as Walmart, Best Buy, and Amazon.com.

GameStop has developed a competitive advantage and strong competencies in operating small stores (avg. 1,400 sq. ft.) with high inventory turns, knowledgeable employees, high customer loyalty (25MM loyalty program members accounting for 75% of sales), and a major focus on used products (which account for 40% of gross profits).

However, GameStop now faces a new set of digital competitors. The hardware platforms themselves (Microsoft Xbox, Sony PlayStation, and Nintendo Wii) have set up walled-garden ecosystems and sell games digitally, direct to consumers. Consumers no longer need to visit a physical store unless they prefer a physical copy they can resell or lend to friends. According to Mintel research, 31% of the consumer group engaged in trade-ins during the last three months and 22% engaged in borrowing. Digitally purchased games cannot be transferred to other people. This is akin to the disruption that hit the book retailing industry.

GameStop Analysis

The shift to digital is already reshaping GameStop’s business. In its latest annual report, GameStop celebrated $1BN in digital sales, creating the impression that it has made progress. However, 98% of company revenue still flows through its brick-and-mortar stores, and GameStop’s “digital” sales are largely codes sold in-store that let the customer download a game directly to a device, much like buying a mobile phone card at Target rather than buying directly from the carrier. This is smart utilization of current resources in the short term, but it cannot be considered transformational in the long run: selling this way is inefficient, and volumes will not be high enough to sustain retail stores once games become entirely digital products. This is evocative of strategic inertia; GameStop is looking at the digital world through its old lens.


Over the past four years, GameStop has invested heavily in growing its capabilities through partnerships and acquisitions. In 2012, it acquired BuyMyTronics, an online electronics re-commerce company that has been instrumental in identifying opportunities to expand the company’s device selection and is integral to the buy-sell-trade model; this acquisition appears to have been successful. In 2010, it acquired Kongregate to tap into and grow its share of the rapidly growing mobile games market. Kongregate has published its first nine mobile games, amassing millions of downloads, and is prominently featured within the Apple App Store.

In 2011, it acquired Spawn Labs, a developer of technology that let users play video games running remotely on machines in data centers rather than on their personal computer or console, and Impulse, Inc., a digital distribution company, with the goal of reaching new gamers in new locations and on many devices. GameStop closed Spawn Labs in 2014 and has done little to build or grow Impulse.

In 2013, it acquired Spring Mobile, an authorized AT&T reseller operating over 160 stores selling wireless services and products, and Simply Mac, an authorized Apple reseller with 23 stores, with the goal of expanding its channels, growing its digital sales base, and enhancing GameStop’s market leadership in the electronic game industry and in digital distribution.

In 2014, GameStop partnered with Shelfbucks and BestFit Mobile to understand the drivers behind the evolution of the retail industry and to advance the digitization of the physical retail space, aiming to leverage the Shelfbucks partnership to deliver a next-generation shopping experience using emerging technologies. Also in 2014, it partnered with Cricket Wireless, an AT&T subsidiary that offers value-conscious consumers a first-class prepaid wireless experience without annual contracts; adding Cricket’s no-contract option as its exclusive wireless partner makes GameStop a “one-stop shop when it comes to affordable phones,” with the goal of seizing growth opportunities in prepaid digital and online timecards. The same year, it partnered with IBM and Texas A&M University to develop new mobile and cloud apps and integrate them with existing systems, and with Dynamic Signal to develop and drive traffic through social platforms.

The firm is poised to seize opportunities through both its brick-and-mortar and online presences. It has a strong presence in North America, a somewhat lesser presence in Europe and Australia, and no presence at all in Asia and South America. GameStop is positioned to seize and transform in the brick-and-mortar and online verticals within any of its four existing segments, or to venture into new geographies.

GameStop has recently repaid all of its debt and has strong positive cash flow from operations. This strong position allows it to pay dividends as well as repurchase stock. As such, GameStop is in an excellent position to invest in the digital transformation, expand its brick-and-mortar presence despite the digitization, and garner a larger share of the international market if it chooses that route.

GameStop has 17,000 full-time salaried and hourly workers and anywhere from 27,000 to 52,000 part-time hourly workers, depending on the season, with the Thanksgiving and Christmas holiday season being the peak. There are no labor unions in GameStop’s US operations; some international employees have collective bargaining agreements. GameStop has some agility in its people strategy because the majority of its employee population is part-time and seasonal and only a small portion of its labor force is unionized. The experience of GameStop’s senior leadership is weighted toward the electronics retail, clothing retail, consumer packaged goods, and health sciences industries, none of which has experienced the magnitude of change the consumer electronics industry currently faces (Exhibit 4). So while GameStop has a dynamic capability in the flexible structure of its employment base, its employees are not necessarily primed to embrace a transformation initiated by senior management, and senior management itself does not have significant transformation experience.

If GameStop is to fully utilize its transient competitive advantages, it must strive to serve its 25MM loyal customers better. Those transient advantages include its brand, knowledgeable employees and hardware remanufacturing capability: GameStop is the largest refurbisher and recycler of electronics in the world, with devices refurbished or remanufactured in a 182,000-square-foot Refurbishment Operations Center located just two miles from headquarters and employing 1,100 people. GameStop’s competitive advantages include its strong skill set around buy/sell/trade, says CEO Paul Raines. It needs to take this skill set and create a new business model for online streaming, since that is where the market is headed within five years. Toward these goals, GameStop introduced a new IT structure in 2013 comprising four strategic areas: IT Strategy, Architecture Design, IT Delivery and the GameStop Technology Institute (‘GTI’). Each area is designed to accelerate the identification and adoption of new capabilities and solutions that address the ever-changing IT challenges facing the business. The GTI focuses on creating affiliations with leading technology corporations and academic institutions to deliver technology solutions to consumers, and has recently entered into strategic partnerships with Shelfbucks, BestFit Mobile and Dynamic Signal to extend this focus within its store environments. These are examples of the structural transformations GameStop has undertaken to build on process and technology innovation and drive profitability in the technology-driven online marketplace of the future.

 

Credits: Geoff, Tanya, Wayne, Dovic, Mike.


Google’s Mobile Strategy

In this post I want to discuss Google’s expected outcome given its Android strategy. Apple released the iPhone in 2007 and quickly gained leadership in the mobile phone market. Google helped form the Open Handset Alliance, a consortium of leading device manufacturers, wireless carriers and chipset makers, and released Android later the same year to compete with Apple on a more direct footing. Google hoped Android would help it move from the desktop to mobile and allow it to put its services in the hands of users, where they can be monetized relatively easily.

Google’s Android strategy was to provide a unique and positive mobile Internet experience. It bundled application and browser assets (Maps, YouTube and Chrome) with Android to support this user experience. It contended that Android would lead the charge in generating new mobile search revenue for Google, which at the time was monetizing search on approximately 200M desktop PCs and Macs[1]. With over 2 billion phones sold annually, the upside was rewarding enough for Google to open-source Android and encourage developers to build applications on the platform.

Google’s mobile vision from 2007-2012 could be stated simply: “get all users and businesses in the world to use Google apps and services, encouraging them to spend time on money-making products”. The core of this strategy was the Google platform, which refers to the computer and mobile software and the hardware resources needed to provide services, and it hinged on the following key components:

  1. Complementary markets: create complementary markets that compete with mainstream markets, such as OS (Android, Chrome OS), browser (Chrome), device (Chromebook) and network (Google Fiber and Google Wi-Fi with Hotspot), ensuring that more and more people engage with Google
  2. Products: continue to innovate in monetization products such as AdSense and AdWords.

With Android, Google expected to become the gateway to the Internet, providing every service an online user could possibly need and monetizing the interaction via search and context-based advertisements.

So how did Google’s mobile strategy fail to achieve its expected outcome? To start with, Google miscalculated how easily its competitors could adapt Android to suit their own needs. Consequently, Google’s market power eroded because its strategy failed to anticipate developments in two areas of the Android ecosystem:

  1. Customization: Android lends itself to easy customization
  2. Competition: Samsung’s Android ecosystem and share of wallet

Customization

Android, being a Linux-based, open-source OS, can be customized by developers and OEMs, either to better suit the device it runs on or to layer an alternative user interface on top of it. Examples include HTC’s Sense and Samsung’s TouchWiz[2].

Walled Gardens

The situation became difficult for Google when competitors customized Android in ways that make it harder to use Google’s services. For example, Baidu is the default search engine on 80% of Android phones in China[3]. Samsung ships its own email client and “Internet” browser, and now preloads Samsung Apps on the Galaxy S5. These customizations are highly visible to users because they are loaded on the home screen.

Facebook has done the most damage. The user interface of its app for iOS and Android devices creates a single pervasive entity that makes messaging, advertisement display and search features easy to navigate and use. These customizations certainly reduce traffic to Google’s core services, and Facebook has become a “walled garden” that Google AdSense cannot penetrate.

These customizations to Android shift market power toward the customizers (Samsung and Facebook in this discussion) and away from Google. Yahoo is also moving in this direction with strategic investments in a product portfolio strengthened by its recent acquisitions of Tumblr and Summly.

Competition

Samsung

In Q1 2013, Android produced an operating profit of $5.3B globally, and Samsung took 94.7% of this profit (Frost & Sullivan). It would be possible for Samsung to “fork” Android, leaving Google services completely out of the next version of its phones and putting Google at an extraordinary disadvantage.

Apple

Google most likely considers Apple to be the primary threat as a platform sponsor today. Apple has done a great job at controlling the mobile value chain: Network -> Device maker -> Operating System -> App Distribution Store -> Primary App -> Services

Apple has kept innovating all these years and has vertically integrated into device components (e.g., glass manufacturer GT Advanced) as well as distribution (the Apple Store). The main differentiator for Google has been its search capabilities, which Apple could potentially match or exceed with Microsoft’s Bing. Also, what Google makes in ad money by showing targeted ads, either in-app or in the browser via AdWords or AdSense, Apple is more than able to make up for by charging the carriers an up-front fee per iPhone sold.

Google’s biggest problem with Apple might lie with customer loyalty[4]. 31% of Android customers say they might consider switching to the iPhone, which would hand Apple a sizeable share of Android users. Clearly, Apple’s consumers are more loyal to the brand than those of any other competitor, including Samsung[5].

Value Metrics- A Survey

The metrics of relevance in the mobile industry are listed below, with performance numbers for Google and Apple.

  1. Revenue generation: The iOS App Store makes $5.1 million in daily revenue, while Google Play brings in $1.1 million per day[6].
  2. Monetization strategies: Freemium apps continue to be a very successful business model across a variety of categories, and iOS users tend to make more in-app purchases than Android users.
  3. Size of the app stores: Both the App Store and Google Play have over 1 million apps, although Google Play reached that mark first, in July 2013; the App Store matched it in November 2013[7].
  4. App quality: iOS apps were ranked higher in quality than Android apps, with an average rating of 68.53 vs. 63.34[8].
  5. International markets: China strongly drove iOS App Store growth, with exceptional gains in both downloads and revenue, while Google Play downloads grew more strongly than iOS in the emerging markets of Brazil, China and India[9].

Other metrics, such as market share, category growth, and store approval rates, do not add much to either company’s differentiated position.

Recommended Changes in Strategy

Google needs to develop a strategy that mitigates the risk of the Android market further fragmenting and ensures the continued growth of its core advertising business. The following strategy would ensure market power remains with Google.

  1. Vertically integrate into operating systems and a range of devices, and
  2. Manage the end-to-end user experience of an Android device (marginalize the OS)

This strategy would enable Google to achieve the following goals:

  1. Capture market share from leading laptop vendors in the price sensitive and lightweight-use segments
  2. Enter new markets globally with help from OEMs and also build partnerships with local retailers
  3. Recover the market power it lost when it released Android to the developer market

Vertical Integration

Google’s move into device OS and hardware looks at the outset like a defensive strategy, in that Google does not derive significant monetary benefits from the sale of either the OS or the hardware. It has positioned itself at the high end with its Chromebook Pixel and distributes Android freely. By doing so, Google has clearly signaled its intention to play ‘platform sponsor’ to other market entrants such as HP, Samsung and Acer.

On the other hand, Google is also playing offense with the introduction of a mobile OS (Android) and a desktop OS (Chrome OS). Hardware manufacturers such as HP, Samsung and Acer carry the Google torch forward by bringing their customers newer and sleeker products at much lower price points than the traditional laptop. All this new market energy brings more user activity to Google via the interactions these users have with Google apps and services.

This strategy also empowers users with “instant search” by being omnipresent across devices, browsers and laptops. Google has already vertically integrated by helping create and distribute the Android and Chrome operating systems, and by partnering with OEMs to create the Nexus range of phones and Chromebook devices. Google has also created and entered the laptop complementary market with the Pixel, but has priced it at the high end so as not to compete with other market entrants such as HP and Samsung.

Marginalize The OS

Google should continue investing in Chrome OS as a browser-based system and make the move to mobile by utilizing the synergies between Chrome OS and the Chrome browser. Google can blur the line between the browser and the operating system: it needs to grow the user base of the Chrome browser so that the browser handles all user interaction, marginalizing the native operating system. This strategy would directly counter Samsung’s dominance of Android through large-scale customizations that block Google services from the user.

Chrome can also be managed much more easily in the market because it is a browser and supports a push-based update mechanism like other browsers do. Updates and new features can be made available faster and hassle-free to users who are always on the go. At this level, Google would match Apple, which has been using a push-based iOS update rollout with its devices for years.

Execute!

What is working for Google is that it is still capable of product innovations like Google Glass and the self-driving car, and it monetizes over 2 billion views per week on YouTube, proving that it can still bring users in to interact with its platform. Google should continue to execute on the strategies discussed in this brief if it is to wrest market power away from Apple and Samsung.

[1] “Google ‘Opens’ a New Front in the Mobile Platform Wars”, Frost & Sullivan, 23 Oct 2008

[2] “How do you Solve a Problem like Android?”, Frost & Sullivan, 12 June 2013

[3] Fool “Google has lost control of Android”, http://tinyurl.com/cxfa5r6

[4] “Apple’s iPhone has 89% retention rate…”, Sept. 2011, http://tinyurl.com/pwhjm3s

[5] “Most Valuable Global Brands”, Oct 2013, http://tinyurl.com/m3t3zfo

[6] ReadWrite, http://tinyurl.com/qd8xlsz

[7] Mashable, Oct 2013, http://tinyurl.com/ngv4c9j

[8] ReadWrite, Jan 2013, http://tinyurl.com/p6mf8qc

[9] AppAnnie, Apr 2013


Strategy in Wireless Technology and Standards

In this post I’ll elaborate on Qualcomm’s strategy of unbundling technology licensing from chipset sales, in contrast to how Intel does it: Intel’s business model is to sell chipsets bundled with technology licenses, while in Qualcomm’s case a mobile device vendor can use chips designed by companies other than Qualcomm that still incorporate licensed Qualcomm technology. Further, this post analyzes competitive strategies in the wireless technology and standards industry. The three key areas of discussion are licensing strategy, competitive strategies for Qualcomm’s competitors in the mobile chipset market, and new market entry with technology and standards.

Qualcomm adopts a business model in which value is captured in two ways:

  1. By outsourcing chipset manufacturing and then selling the chipsets to device manufacturers; this makes up 63% of Qualcomm’s overall revenue
  2. By IP licensing, which makes up 27% of Qualcomm’s overall revenue. Qualcomm relies on its patents for licensing revenue, which funds R&D investments in next-generation telecommunication technology such as 3G CDMA standards and the development of OFDMA-based 4G standards[1].

Strategically, it is more valuable to bundle two or more loss-makers and to unbundle two or more profitable products. For Qualcomm, technology licensing and ASIC (chipset) manufacturing could each be profitable on their own. Qualcomm was able to charge CDMA licensees ongoing royalties (in the realm of about 5% of ASP), and sometimes upfront payments for development support, for the use of its patented technologies in manufacturing and selling CDMA-based products.

Qualcomm sold CDMA licenses and chipsets across the entire value chain: it would sell CDMA licenses to network operators like PacTel, and chipsets to handset manufacturers like Kyocera, which purchased Qualcomm’s handset division[2].

Another reason Qualcomm unbundled licensing from selling chipsets was that the major industry bodies (the FCC, CTIA and TIA) would be more likely to favor CDMA if it was perceived as a fragmented market[3]. Had Qualcomm chosen to sell chipsets only to device vendors that could not use chips designed by companies other than Qualcomm, it might have been perceived as exerting too much control over CDMA, making the latter a less likely choice for a standard.

While technology licensing made Qualcomm very good money at very high margins, and continues to do so today, royalty costs became a sticky issue with handset manufacturers. Intense competition in a maturing wireless market hurt manufacturers, some of whom decided to pursue non-proprietary standards, as was the case with Intel’s PCG, which chose TDMA-based GSM.

Early in its life Qualcomm ensured it retained significant market power, justifying the strategic decision to work with a network of licensees for its technology portfolio while selling chipsets to wireless equipment and phone manufacturers.

By 2004, QCT, the division that makes integrated chipsets and software and sells them to both wireless phone and infrastructure manufacturers, contributed over 45% of total income before taxes, a YoY growth of 23.6%[4]. QTL, the technology licensing division, contributed 51.6% of total income before taxes in 2004, a YoY growth of 24.93%. Historical gross margins for QCT and QTL are summarized below:

Figure 1: QCT & QTL Gross Margin Trend (2003-13)

Notice the decrease in QCT’s margins over the decade. Qualcomm explained that the decrease in QCT gross margins as a percentage of revenues in FY 2012 was primarily due to a 33% increase in R&D and SG&A expenses, and that the decrease in FY 2013 was due to the net effect of lower average selling prices and an unfavorable product mix[5]. However, this drop in margins is not surprising given that the cellular handset market has grown extremely cost competitive. The table below shows the cumulative decline in average selling price (ASP) for the worldwide wireless phone industry over a nine-year period[6]:

Year                     2000   2001   2002   2003   2004   2005   2006   2007   2008
Cumulative ASP decline     0%    -5%   -12%   -17%   -23%   -23%   -25%   -27%   -28%

Table 1: Cumulative Decline in ASP

Note: Reduced margins resulting from declining ASPs are somewhat offset by the growing mobile user population.

Let us now consider the implications for companies such as Intel and Broadcom that are attempting to enter and compete in the mobile chipset market. Generally speaking, such entrants need to identify gaps in their technology portfolios and fill those gaps with in-house manufacturing, inbound OEMs, or acquisitions of firms specializing in the production and support of the missing technologies. Entering the wireless market almost mandates that the following strategic elements be in place first:

  1. Complete Technology Solution: this would ensure the entrant is able to compete on an equal footing with incumbents and other entrants (e.g. Intel vs. TI)
  2. Joint Ventures and Partnerships: the entrant must be able to forge strategic relationships with value chain participants, such as leading handset manufacturers, to ensure the selected business strategy delivers profitability and market power (e.g. ODM-to-operator plus partnerships with leading handset manufacturers)
  3. Volume Economics: lower per unit costs are achieved using open standards
  4. Emerging markets: partnering with handset manufacturers and data operators to gain market share in China and India that have a combined 1.6B phone users[7].

However, not all of these strategic goals may be achievable, owing to market constraints such as competition from incumbents, the prohibitive cost of building the technical capabilities necessary to compete, and the inability to forge long-standing relationships with participants in the mobile phone value chain.

Complete Technology Solution

Intel needs to identify what technologies its portfolio lacks and move to fill the gaps. For example, it knew early on that it lacked baseband processing, a field in which Texas Instruments was the market leader. It was easier for TI to license ARM core technology than it was for Intel, which already held an ARM license, to build baseband-processing capability from scratch. Intel therefore acquired DSPC to complete the technology stack it needed to compete in the mobile chipset market.

Joint Ventures and Partnerships

When Intel entered the non-cellular Personal Digital Assistant (PDA) market, it was hampered by cell phone manufacturers that smartly began incorporating PDA capabilities into smartphones. The new data-capable cellular networks that operators were building further boosted the cell phone manufacturers, and incumbent mobile chipset and handset manufacturers had much stronger relationships with the operators that owned those networks.

A viable strategy for companies like Intel and Broadcom is to explore alternate paths to market entry, such as the Original Design Manufacturer (ODM)-to-operator model. Under this model, Intel would supply chipsets to ODMs, which would optimize for cost and manufacture mobile phones for sale to network operators; the operators would then subsidize the cost of the phone in return for a service contract from the end user.

Chipset development is a high-cost endeavor if the company chooses to manufacture the chips itself. On the other hand, the ODM-to-operator strategy can only succeed if the company enters into relationships with leading handset manufacturers to ensure the volume targets needed to reduce its cost basis are reached and sustained. Successful entry into emerging markets depends on the success of the strategies discussed above.

Finally, it is worth mentioning the implications for companies attempting to introduce new cellular technology standards (think WiMAX). In my view, the key success factors for a company’s go-to-market strategy when introducing a new cellular technology are:

  1. Ecosystem supporting the new technology
  2. Choosing open standards

Additionally, companies introducing new technologies to the mobile industry also need to ensure they execute on each of the strategic elements in their go-to-market strategy.

Ecosystem

Equipment manufacturers, cable companies and mobile operators form the value chain for cellular technology. It is vital to get commitment upfront from network operators that hold the most attractive spectrum before striking deals with equipment manufacturers to put the new WiMAX or FLASH technology in their devices.

Open Standards

Intel, with WiMAX, and other companies introducing new cellular technologies have a strategic decision to make: deliver products based on proprietary technologies or back an open standard. An open standard is widely available to any equipment maker, which means the entrant will have to deal with competitors; however, market innovation will be high, and with the right mix of product-line and service-line strategies the entrant might even end up with differentiated products that command price premiums.

Another key factor favoring open standards is that they enable global economies of scale and decreasing costs. Complete standardization results in a high volume of chips on the market based on the open standard, which can therefore be used inside the devices of every device manufacturer. The chip manufacturer gets to produce a high volume of chips, spreading fixed costs over a large number of units and ultimately resulting in a lower per-unit cost, as the sketch below illustrates.
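
To make the volume economics concrete, here is a minimal sketch with purely illustrative cost figures (the fixed and variable costs below are assumptions, not Qualcomm’s or Intel’s actual numbers) of how per-unit cost falls as fixed R&D and tooling costs are spread over larger chip volumes:

```python
def per_unit_cost(fixed_costs: float, variable_cost: float, volume: int) -> float:
    """Average cost per chip: fixed costs are spread over total volume."""
    return variable_cost + fixed_costs / volume

# Illustrative numbers only: $200M of fixed R&D/tooling, $4 variable cost per chip.
FIXED = 200_000_000
VARIABLE = 4.0

for units in (5_000_000, 20_000_000, 100_000_000):
    print(f"{units:>12,} units -> ${per_unit_cost(FIXED, VARIABLE, units):.2f} per chip")
# 5,000,000 units   -> $44.00 per chip
# 20,000,000 units  -> $14.00 per chip
# 100,000,000 units -> $6.00 per chip
```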

That’s all folks. Hope you enjoyed the post.

[1] Qualcomm 10-K, Nov 2013, http://tinyurl.com/n63vw6d

[2] Page 14, Intel in Wireless in 2006 (A): Tackling the Cellular Industry SM-165

[3] Page 168, Chapter 15, The Qualcomm Equation, Dave Mock

[4] Qualcomm 10-K, Nov 2004, http://tinyurl.com/lsgguy2

[5] Qualcomm 10-K, Nov 2013, http://tinyurl.com/n63vw6d

[6] “Qualcomm, Inc. 2004”, HBS Case. Data derived from Exhibit 5.

[7] Technology Intelligence & IP Strategy, http://tinyurl.com/mohl3z9


Haas VCIC: Would you invest?

Background

ABC is a hypothetical voice data processing company that has created technology to secure voice transactions. The worldwide market opportunity is approximately $5B-$7B. ABC is asking for $8M to enter the US and Australian markets and to develop the next generation of its products.

Overall Investment Decision and Explanation

Our decision is to invest $8M for a minimum 14% equity stake in ABC in 2013. ABC is expected to attain a 25% market share by year 4 (2016). Using the EBITDA-multiple approach, the projected Year 4 valuation is $104M, with an IRR of 22%. Using the 10x revenue-multiple approach, the projected intrinsic value of the firm is $144M in Year 3, with an IRR of 36%, and approximately $260M in Year 4, with an IRR of 66%. Under either approach, exiting in Year 4 meets and exceeds our stated exit objectives, although the 10x revenue multiple is more aggressive and needs to be examined in more detail against precedent valuations in the US market. Given ABC’s proven business model, sound business plan and untapped market potential, we feel there is a high probability of receiving a 3x-4x return on our invested capital.
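
As a sanity check on the quoted returns, here is a minimal Python sketch of the exit math; it assumes the minimum 14% stake and a roughly three-year holding period from the 2013 investment to the Year 4 exit, which reproduces the 22% and 66% figures above:

```python
def exit_irr(invested: float, exit_valuation: float, stake: float, years: float) -> float:
    """Simple single-exit IRR: (proceeds / invested) ** (1 / years) - 1."""
    proceeds = stake * exit_valuation
    return (proceeds / invested) ** (1 / years) - 1

INVESTED = 8.0   # $8M invested
STAKE = 0.14     # minimum 14% equity

# EBITDA-multiple approach, Year 4 valuation of $104M
print(f"EBITDA multiple:  {exit_irr(INVESTED, 104.0, STAKE, 3):.0%}")   # ~22%

# 10x revenue-multiple approach, Year 4 valuation of ~$260M
print(f"Revenue multiple: {exit_irr(INVESTED, 260.0, STAKE, 3):.0%}")   # ~66%
```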

Valuation Process

To estimate the current value of the company we would need actuals-to-date data; in lieu of that, the company’s 2013 year-end forecast suggests a value of roughly $66M based on 10x expected revenue. We therefore focused on future value, using a bottom-up valuation approach in which we estimated ABC’s revenues using its UK contracts as proxies. Assuming a British pound to US dollar conversion rate of 1.6 to value the contracts in US dollars, we arrived at attractive Year 3 and Year 4 firm valuations using a 10x revenue multiple. We then also applied a 12x EBITDA multiple, with EBITDA/revenue ratios taken from the UK forecasts as inputs to our financial model; this method also produced an attractive exit in Year 4.
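
The mechanics of the two multiple approaches can be sketched as follows; the contract revenue and EBITDA margin below are placeholder assumptions for illustration, not ABC’s actual UK figures:

```python
GBP_TO_USD = 1.6       # assumed conversion rate from the valuation above
REVENUE_MULTIPLE = 10
EBITDA_MULTIPLE = 12

# Hypothetical inputs: the real figures came from ABC's UK contracts/forecasts.
uk_contract_revenue_gbp = 16.0   # GBP millions, assumed
ebitda_margin = 0.25             # assumed EBITDA/revenue ratio

revenue_usd = uk_contract_revenue_gbp * GBP_TO_USD
revenue_valuation = REVENUE_MULTIPLE * revenue_usd
ebitda_valuation = EBITDA_MULTIPLE * revenue_usd * ebitda_margin

print(f"10x revenue valuation: ${revenue_valuation:.0f}M")   # $256M with these inputs
print(f"12x EBITDA valuation:  ${ebitda_valuation:.0f}M")    # $77M with these inputs
```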

Expected Return

We benchmarked invested-capital returns for VC firms and noted the variance in expectations from seed to Series D and beyond: the later the round, the less upside is expected. With this benchmark in mind for a Series B investment in Semafone, we expect a minimum return of 3x-4x, or approximately $32M. From our valuation due diligence we expect the return to be in the range of $40M to $70M.

Details

Figure: Detailed valuation model (valuation_sema_v2)

Thanks for reading!


Gamification orthodoxies: intrinsic and extrinsic rewards

Engagement

Let’s begin by defining the term and anchoring on the metrics used to measure it. Engagement indicates a connection between the business and the customer. No single metric I found sufficiently measures it: website hits per second, page views per day or unique visitors per day all fall short of identifying who is engaging with our products and ideas, i.e., with our business as a whole. For reference, the series of interrelated metrics devised by Zichermann and Cunningham (2011) are: Recency, Frequency, Duration, Virality and Ratings.

It is vital to remember that before we can begin designing games for engagement, we need to know what we want players to do, that is, which social actions we want to encourage. These “social verbs” are verb forms (other than ‘buy’ or ‘consume’) that should be exhaustively listed and then ranked by preference, with perhaps the top five chosen for implementation. Remember also that the reward loop must run indefinitely: once you start rewarding the players, you cannot take the rewards away from them. Use this constraint to compute the total cost of ownership of the gamification system, as sketched below.
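
One simple way to cost an open-ended reward loop is to treat it as a perpetuity; this is a minimal sketch with purely illustrative numbers (the player count, reward cost and discount rate are assumptions, not figures from the text):

```python
def reward_loop_tco(players: int, rewarded_actions_per_year: float,
                    cost_per_reward: float, discount_rate: float) -> float:
    """Present value of an indefinitely running reward loop (perpetuity)."""
    annual_cost = players * rewarded_actions_per_year * cost_per_reward
    return annual_cost / discount_rate

# Illustrative assumptions: 50,000 players, 12 rewarded actions per year,
# $0.50 of reward value per action, 10% discount rate.
print(f"TCO ~ ${reward_loop_tco(50_000, 12, 0.50, 0.10):,.0f}")  # ~ $3,000,000
```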

Orthodoxy #1: Intrinsic motivation is better than extrinsic rewards

We need to accept the motivational states of the players as they are, and not try to change them at all. When we make the motivation extrinsic, we could potentially shift the locus of responsibility from hoping change happens, to a structure and process designed for making it happen.

Orthodoxy #2: Intrinsic motivators create greatness, while extrinsic motivators do not

Extrinsic motivators can be noble too. We, as game designers, are helping people (customers, sales force, managers) reach a higher potential, to discover ways of getting better job results.

Orthodoxy #3: The best gamification designs are intrinsic

No, they are not always. A good extrinsic motivator can be developed into a good map to intrinsic motivation. Take the case of the airline industry. Fly-by check-in and priority boarding now seem normal; in fact they feel intrinsic, like they have always been there. But they haven’t. They are rewards that were created out of nothing when the airlines decided to implement them for business class passengers. As a demonstrative example, an airline might then make fly-by check-in available to the economy traveler who logs, say, 10 flights a month. Even though there was previously no reward for economy travelers, now there is, and it has come to feel intrinsic in nature.

The better a game designer knows the players, the better the engagement will be. Over time the players will want to feel the game was their idea to begin with; their actions will no longer feel alien to them, but natural and aligned to their personalities.

We have successfully gamed the players.


Economies of scale: the good and bad

Today we’ll look at the other side of economies of scale through a common financial ratio used to forecast expenses: SG&A/Sales.

Fixed components of SG&A tend to give the SG&A/Sales ratio economies of scale. Before taking an analytical approach (plotting the % change in SG&A against the % change in sales each year), it is helpful to read the MD&A section of the 10-K to uncover any potential movement in the ratio due to declared discretionary cost cutting, and to look for data on the historical movement of the ratio. Assuming there is evidence of economies of scale, plotting the % change in SG&A on the Y-axis against the % change in sales on the X-axis should give an upward-sloping relationship; in particular, the slope will be significantly less than one.

If the slope of the line is flatter when the % change in sales is negative, it indicates that SG&A costs decline when sales decline, but at a much slower rate. It is harder to cut costs in bad times than it is to add costs in good times! So when sales drop, economies of scale actually “cause” costs to stick around. The point is that most managers think of economies of scale as a good thing: when sales increase, costs do not increase as rapidly. True, but when sales decline the sword cuts the other way: costs do not fall as rapidly as sales. A simple illustration of estimating these two slopes appears below.
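
Here is a minimal sketch of that diagnostic, fitting separate slopes for years in which sales grew and years in which they shrank; the yearly figures are made up purely for illustration:

```python
import numpy as np

# Illustrative (made-up) yearly % changes in sales and SG&A.
sales_chg = np.array([0.10, 0.06, -0.08, 0.12, -0.05, 0.09, 0.04, -0.03])
sgna_chg  = np.array([0.06, 0.04, -0.02, 0.07, -0.01, 0.05, 0.03, -0.01])

def slope(x: np.ndarray, y: np.ndarray) -> float:
    """Least-squares slope of y on x."""
    return np.polyfit(x, y, 1)[0]

up = sales_chg > 0
print(f"Slope when sales grow:   {slope(sales_chg[up], sgna_chg[up]):.2f}")
print(f"Slope when sales shrink: {slope(sales_chg[~up], sgna_chg[~up]):.2f}")
# A slope well below 1 indicates economies of scale; an even flatter slope on
# the downside indicates cost stickiness: SG&A does not fall as fast as sales.
```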

This is typically the case when a firm is stuck with a cost structure that has a large fixed component. While that structure helps achieve economies of scale, it also imposes “operating leverage”, a type of risk, on the company. If sales volume is highly variable, then in periods of low volume the firm will be left high and dry, with all the fixed costs hanging around and insufficient sales to cover them, resulting in a negative EBITDA margin! The short example below makes this concrete.
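
A minimal numeric sketch of operating leverage, using invented figures: with fixed costs held constant, the same cost structure that produces a healthy margin at high volume turns the margin negative at low volume.

```python
def ebitda_margin(sales: float, fixed_costs: float, variable_ratio: float) -> float:
    """EBITDA margin given fixed costs and variable costs as a share of sales."""
    ebitda = sales - fixed_costs - variable_ratio * sales
    return ebitda / sales

FIXED = 40.0           # $40M of fixed SG&A and overhead (assumed)
VARIABLE_RATIO = 0.60  # variable costs are 60% of sales (assumed)

print(f"High volume ($150M sales): {ebitda_margin(150, FIXED, VARIABLE_RATIO):.0%}")  # ~13%
print(f"Low volume  ($90M sales):  {ebitda_margin(90, FIXED, VARIABLE_RATIO):.0%}")   # ~ -4%
```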

There seems to be no way around this. A firm could reduce its operational risk by “renting” the infrastructure it needs for operations; however, when average selling prices go up, the owners of that infrastructure raise their rental fees, cutting into the potential profit margins of the risk-averse firm. No risk duly leads to minimal reward.

Hope you enjoyed this post!


Executive leadership for Security Management: guidelines for success

At the CIO/CISO level, technology-centric leaders are fast being replaced by executives who take a more holistic approach to technology security and risk management. The leaders being sought are those who understand operational effectiveness, governance and partnerships, and who possess exceptional leadership skills. These leaders realize that effective risk management programs ultimately improve customer service and increase shareholder value. The modern firm is results-oriented, and so must be its leaders. Devising novel ways of answering the business’s questions in the affirmative goes a long way toward building consensus; however, it requires thinking of security as an enabler of the business, not as an operational burden.

Incentive structures at the CISO/CIO level now also reflect internal risk and return versus the performance measures and controls implemented. Executives can no longer work in silos; they must align investments across all business-related risk areas with external and internal exposures. Those who use established, standards-based enterprise IT or non-IT frameworks are at an advantage from the outset. ISACA’s case study using COBIT 5 for non-IT-related business strategy execution is a great example of an enterprise technology framework lending itself cleanly and seamlessly to strategy implementation through the ‘goals cascade’ methodology. Specifically, ISACA notes, “the strategic initiatives were undertaken in such a way as to enable the goals cascade, i.e., that the needs of stakeholders (members, certification holders, others in ‘IT trust’ professions, and enterprises that are dependent on IT, among others) were reflected in appropriate organizational goals, the achievement of which would be enabled by achievement of the goals of the entire strategic portfolio, which in turn would be supported by achieving individual initiative goals”.

At a tactical level, security executives need to be able to think in terms of operational risk, because that is the common language the business understands. The art of leadership shows in how well he or she builds credibility and tailors the security message to the diverse risk appetites of the business. The ability to sell well-formulated risk strategies to insiders is another key skill that distinguishes good executives from average ones.

Soft skills that are usually not written into the job description are vital for the security leader: these traits, differentiators really, add to the overall effectiveness of the CISO/CIO office. Several communication models exist for executives to use when describing their fit for security executive roles: STAR (situation or task for context, articulate the action, explain the result) and PAR (describe the problem, articulate the action, explain the result) are examples. Colin Powell’s statement about leadership in general has profound implications for security executives: “Great leaders are almost always great simplifiers who can cut through argument, debate, and doubt to offer a solution everybody can understand.” Simplification is indeed a unique skill that security executives and managers must use to communicate risk mitigation plans to stakeholders.

Expertise is overrated; you’re only an expert for a few minutes anyway, and there is always someone who knows more. Arrogance, self-reliance and cavalier attitudes land security executives in trouble more often than we would like to think. Leaders must be able to collaborate across the information security, privacy, risk management and governance functions. One way to do this is to rely on relationships; another is to develop key players into change agents, people who can carry your message far and wide into the firm. An extremely important trait is the ability to deliver the appropriate security message to stakeholders, which requires understanding the business objectives, how security helps achieve them directly, and the wider lens of the firm’s reputation in the market.

And don’t forget this: always ask for a security budget backed by a sound business model and financial plan!


Risk Register: a key de-risking tool for a firm

Why do you need it?

Most widely accepted security and risk-oriented frameworks discuss the need to adopt a formalized, structured approach to risk management and security. Most of them do not ask you to create a “risk register” but imply the need for such a repository. The Risk IT Framework, published by ISACA and associated with the COBIT 5 framework, does explicitly call this out as a requirement for compliance. Specifically, Process Goal RE3 (Risk Evaluation) under “Maintain your Risk Profile”: “Maintain an up-to-date and complete inventory of known risks and attributes (e.g., expected frequency, potential impact, disposition), IT resources, capabilities and controls as understood in the context of business products, services and processes.”

In firms wishing to raise the maturity of their risk management practices, systematic risk documentation and response can act as a solution accelerator, not in a prescriptive manner but as a platform upon which an improved set of practices can be built. I have always maintained that managing risk is not a tactical maneuver but a strategic choice. In over a decade of information security consulting I have mostly encountered firms not managing risk at all; most do not even document it to begin with!

The concept of the risk register is related to risk maps and risk scenarios. The risk register provides detailed information on each identified risk, including the risk owner, details of the risk scenario, the risk scores from the risk analysis, the response, and the risk status. Risk registers serve as the primary reference for all risk-related information. Managers consult the risk register before making risk-related decisions, and there should be a strict “no surprises” rule: the register contains no new information not already known to the firm. It is simply a convenient technique for storing and maintaining all collected information in a format useful to all stakeholders.

Template

Gartner has suggested firms use a table that allows a great deal of information to be conveyed in a few pages and provides flexibility in content and presentation as the document changes over time. The information recorded in the risk register could be structured as follows (derived from an ISO 27001 template):

ID: Assign a unique reference in order to be able to identify each risk unambiguously (e.g. “15-2013” might be the 15th information security risk introduced into the analysis during 2013). Sequential numbers allow the table to be sorted sensibly!
Risk: Describe the information security risk briefly so that people will understand what risk you are assessing.
Asset owner: Who is the Information Asset Owner, the person who will be held to account if the risk treatments are inadequate, incidents occur and the organization is adversely impacted? It is in this person’s interest to assess and treat the risks adequately, or face the consequences.
Impact: Describe the potential impacts should the risk occur, ideally in business terms. Decide whether to use “worst case” or “anticipated” impacts and be consistent about it! Consistency is especially important as the risk register gets larger and more people get involved in the assessments.
Raw probability: Enter the probability or likelihood that the risk would occur if it was totally untreated, as a percentage value (see the section on scoring).
Raw impact: Enter the potential business impact if the risk occurred without any treatment, as a percentage value (see the section on scoring).
Raw risk rating: This is the product of the raw probability and impact values, in other words the raw/untreated/inherent level of risk.
Risk treatment: Describe how the risk is to be treated. Note that controls are just one option: risks can also be avoided, transferred or accepted.
Treatment cost: Estimate the total cost of mitigating the risk.
Treatment status: To what extent is the planned treatment in place? 0% means the treatment is only a plan at present, and nothing has been done about it as yet. 100% means the treatment is fully operational.
Treated probability: Enter the probability that the risk will eventuate once the controls etc. are fully in effect, in the same way as for before mitigation (see the section on scoring). Treated values should be shown in bold if they are different to the raw values.
Treated impact: Enter the likely impact once the controls etc. are fully in effect, in the same way as before mitigation (see the section on scoring). Note that the impact of any incidents that actually do occur may increase with strong controls in place, since incidents due to control failures are not usually expected. Treated values are shown in bold if they are different to the raw values.
Target risk rating: This is the product of the anticipated probability and impact values once the risk treatment is fully implemented.
Current risk rating: This is the risk rating today, given the implementation status and the anticipated probability and impact values when fully completed. For example, if the raw risk value is 50% and the treated risk value is 30% but the treatment is only 40% implemented, the current risk level is 32%. In reality, many security controls are either fully implemented and fully effective, or partially implemented and not at all effective, but the risk calculation here takes into account the fact that work is under way, hence management can assume the risk will be treated in due course.
Notes: Keep notes about the risks, e.g. the factors you have taken into account and your reasoning, including any significant assumptions.
Last checked: Record the date on which the risk was last reviewed, updated and/or approved by management. Sort on this column to find risks that have not been checked in a long time and hence maybe should be reviewed or reconfirmed.
Residual risk assessment: This describes the risk levels after consideration of mitigating controls and the treatment plan. Typically, mitigating controls will affect likelihood, impact or both, resulting in a lower level of risk. If a mitigating control doesn’t have a noticeable impact on risk, then its efficacy should be re-evaluated. Risk-averse organizations may choose to consider both the “typical case” and “worst case” of a particular risk.
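
As an illustration of how the rating columns relate, here is a minimal Python sketch of one register entry. The raw and target ratings follow the “product of probability and impact” definitions above, while the current-rating calculation is just one possible interpolation convention (the worked example in the table may use a different weighting), and all input values are invented:

```python
from dataclasses import dataclass

@dataclass
class RiskEntry:
    risk_id: str
    raw_probability: float      # e.g. 0.50 for 50%
    raw_impact: float
    treated_probability: float
    treated_impact: float
    treatment_status: float     # 0.0 = plan only, 1.0 = fully operational

    @property
    def raw_rating(self) -> float:
        # Raw/untreated/inherent risk: product of raw probability and impact.
        return self.raw_probability * self.raw_impact

    @property
    def target_rating(self) -> float:
        # Risk level expected once the treatment is fully implemented.
        return self.treated_probability * self.treated_impact

    @property
    def current_rating(self) -> float:
        # Assumed convention: linear interpolation between the raw and target
        # ratings according to how much of the treatment is in place.
        return self.raw_rating - (self.raw_rating - self.target_rating) * self.treatment_status

entry = RiskEntry("15-2013", 0.70, 0.60, 0.40, 0.50, 0.40)
print(f"raw {entry.raw_rating:.0%}, target {entry.target_rating:.0%}, current {entry.current_rating:.0%}")
# raw 42%, target 20%, current 33% (under the interpolation assumption above)
```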

Risk scoring

An example structure for establishing scores is shown here:

Figure: example risk scoring structure.

 

Caution

What your risk register should not contain is also worth a mention. Since the risk register’s primary audience is “business folks”, that is, managers and senior executives, it should not be burdened with technical risks or threats. For example, don’t state that a particular set of servers is using a certificate susceptible to an MD5 collision vulnerability. Nor should every conceivable threat be recorded, only the most significant ones that pose a serious challenge to the firm’s competitive position, operations and value system/chain. Keep details of treatment plans, product information and the like separate as well, since these items distract from the core idea behind the risk register: let the business stakeholders know what they are dealing with and have them formulate remediation strategies. Moving immediately to tool-based solutions assumes knowledge about the business impact of the risks that can only properly be defined by the business stakeholders themselves.

Hope you found the post useful!


Sensing new product innovations – a dynamic capability

Customers are often able to perceive the potential for applying new technology. Visionaries within customer organizations can anticipate the potential of a new technology and sometimes begin developing prototypes themselves. If suppliers fail to understand the unmet need, any new products they develop are likely to fail.

One of the most consistent findings from empirical research is that the probability of an innovation succeeding commercially is highly correlated with the developer’s understanding of customer needs (Freeman, 1974). Companies that are alert to these developments are often able to turn customer-conceived ideas and prototypes into new products and services, as the customers themselves are frequently unable to take initial prototypes to enterprise-ready technology.

Suppliers can also “supply” innovation important enough to be included in the final product, as is classically the case in the microprocessor industry. This “upstream” innovation has shaped competitive outcomes in PCs, cellphones and consumer electronics more generally. David Teece writes that failure to “design in” new technology and components in a timely fashion leads to failure; conversely, success can sometimes be achieved through continuous, rapid “design in”. With rapid innovation by suppliers, downstream competitive success flows to the companies that can tap into external innovation ahead of the competition, if only they can “sense” such developments in time!

Sensing opportunities and threats can also be facilitated if the strategy leader uses an analytical framework that surfaces the most important directions. Within the dynamic capabilities framework, the relevant environment is not simply the industry, as the Porter Five Forces model previously proposed, but the business ecosystem: the community of organizations, institutions and individuals that impact the firm, its customers and its suppliers. This community includes suppliers, complementors, regulatory authorities, standard-setting bodies, and educational and research institutions.

See a related blog post discussing benefits of pursuing an ecosystem strategy at Business Strategies: Vertical Integration vs. Ecosystems.

Innovation and its supporting ecosystem infrastructure have an immense impact on competitive outcomes in the market. However, even when using the ecosystem as the paradigm for sensing new innovations, it is possible for the firm’s strategy leaders to miss the full import of market trends, industry performance statistics and other developments. The evaluative and inferential skill (as Teece puts it) of the firm’s managers therefore becomes important. The decision relevance of information gathered purely inside the firm is rather low, and it must be supplemented by customer insights, outside research and analysis, market research, competitive response predictors, and so on. Management needs to make sense of this data and prepare it for action. Yet there is still too much to make sense of, and attention is in short supply, with near-term goals always exerting pressure on management. The firm’s articulated strategy can help here by acting as a relevant and current filter, so that attention is not diverted to every business opportunity or threat.

Hope you enjoyed this post!


“Making Meaning”: Innovation core idea

In B2B settings, before we deliver a software or process solution it is worthwhile, and sometimes imperative, to understand the organizational dynamics of the firm. The tacit or latent knowledge contained in the minds, experience and sensibilities of the workers is a pool of vital knowledge that “observer-consultants” should tap into routinely.

Ethnography is the rigorous study of the routine daily lives of people in a culture, or a business in our case, with the intent of understanding those people from their point of view as well as our own. In business applications, its intent is to understand people’s desired outcomes and to find explanations for why people do what they do. Ultimately, it aims to uncover unmet, or even previously unstated, needs that indicate a “market gap” and point toward a set of solutions to fill it. Finding new ways to fill these gaps in our lives is, of course, at the heart of innovation.

Such research first seeks to find out what people are doing, and then proceeds to understand why they do those things. The process moves from uncovering actions, to uncovering feelings, and then on to ways of understanding and interpreting the findings. Darrel Rhea (2005) refers to this as “making meaning”.

The outcome of these observations is a rich set of data that can be mined to develop explanatory models of the firm’s culture. This approach opens up the possibility of extending the models to create a foundation for changing that culture through new products, programs and services, and it gives us great insight into creating “games” that people could play to bridge the gaps we find and improve worker productivity.
