
Strategy

The thesis here is that design's value should be a rich equation, measured within companies, across industries, as the result of design policies, and as part of the economy and society as a whole.

Anna Whicher, Research Officer, International Institute of Design Policy & Support, Design Wales, Cardiff

Gisele Raulik-Murphy, Senior Researcher, International Institute of Design Policy & Support, Design Wales, Cardiff

Gavin Cawood, Operations Director, International Institute of Design Policy & Support, Design Wales, Cardiff

Evaluating Design: Understanding the Return on Investment


by Anna Whicher, Gisele Raulik-Murphy, and Gavin Cawood

An increasing body of knowledge asserts the positive contribution of design to economic growth. In recent years, researchers and practitioners have strived to evaluate the impact of design at micro and macro levels in comparative studies around the world.1 Despite encouraging results, some of these methods remain impractical for providing concrete input for informed and strategic policymaking. This is particularly significant at a time when design is rising up the policy agenda. Due to a myriad of converging factors, design policies are emerging and maturing across the globe. Not least among these factors is awareness of successful cases in which design has been integrated into a government strategy for economic growth. Asian and Scandinavian examples are among the most prominent, as was demonstrated in the most recent issue of this journal.

1. Designium, Global Design Watch 2010, SEE bulletin issue 5 (2011), University of Wales Institute, Cardiff, pp. 3-5; APCI, Économie du Design, Paris, 2010; Design Council, Design Industry Insights 2010, London, 2010; H. Hollander and A. Van Cruysen, Design, Creativity, and Innovation: A Scoreboard Approach, Pro Inno Europe, 2009; J. Moultrie, Developing an International Design Scoreboard, SEE bulletin issue 1 (2009), University of Wales Institute, Cardiff, pp. 3-6; KIDP, National Design Competitiveness Report 2008, Seoul, 2008.

In Europe, design has received more and more attention at the policy level. In 2009, a European Commission survey asked about serious barriers to the better use of design in Europe. The most significant obstacle was considered to be a lack of understanding of design among policymakers; the second was a lack of knowledge and tools to evaluate the rate of return on design investment.2 Following the consultation, in October 2010, design was highlighted as a priority in the Europe 2020 Flagship Initiative Innovation Union.
2. European Commission, Results of the Public Consultation on Design as a Driver of User-Centered Innovation, Brussels, 2009.


As part of the Europe 2020 strategy, the European Commission has set three interconnected goals for future growth: it should be smart, sustainable, and inclusive. It has also stated that design can contribute to all three factors.3 With design now firmly on the European political agenda, policymakers across Europe are looking to understand its role in innovation and its return on investment. If design is to fully justify its emerging profile at the policy level (particularly in Europe), researchers must answer a few fundamental questions: What are the challenges associated with evaluating design? How does design enhance a company's competitiveness? Is industry taking advantage of design resources? Has government investment in design programs and policies paid off? To what degree is design contributing to national development? As design climbs the policy agenda, addressing the evaluation of the return on design is more relevant than ever. Evaluation is a vital part of the evidence that supports decision-making and, in the context of government cuts (a searing current issue in Europe), needs to be able to stand up to rigorous scrutiny.
3. R. Buescher, Design in the Europe 2020 Agenda, speech at the Design and Learning conference, Brussels, November 25-26, 2010.

This article seeks to provide an overview of current practice in design evaluation and proposes a number of dimensions that must be taken into consideration when evaluating design at the micro and macro levels in both the private and public sectors:
• Return on investment in individual companies
• Return on investment in national industry
• Return on investment in design programs and policies
• Return on investment in economy and society
Levels of Design Evaluation

Differentiating these dimensions is vital for informing policymakers about the different aspects of design practice. We note that evidence on the impact of design investment is needed at various levels, from investments made by single companies for their individual profit to the impact resulting from a policy that promotes design nationally. In proposing this framework (Figure 1), we intend to help researchers understand these various dimensions and encourage them to conduct studies that will form a richer body of knowledge on the value of investments in design. Each dimension is illustrated by a case study representing current practice.

Figure 1. Framework representing dimensions in evaluating design. The framework crosses the private and public sectors with the micro and macro levels: (1) individual companies and (2) national industry in the private sector; (3) individual programmes & policies and (4) the national economy / society in the public sector. Source: G. Raulik-Murphy, Evaluating Design: Workshop Introduction, from the SEE workshop Evaluating Design and Innovation Policies, Florence, Italy, May 10-11, 2010.


Evaluating the Return on Design in Companies

There has been a drive over the past decade to investigate the return on investment (ROI) of design in individual companies to prove its commercial value, not only to potential clients but also to decision-makers in government. This is probably the dimension that has been best explored under current practice, mainly in the form of case studies published by individual design agencies and national design bodies. The European Commission recognizes that the results are compelling: companies that invest in design tend to be more innovative, more profitable, and grow faster than those that do not.4 Nevertheless, measuring design in statistical terms remains problematic, since evaluation is costly and design's contribution cannot easily be extracted from the broader commercial context. As an example of this type of evaluation, we present Dave, winner of the 2008 DBA's Design Effectiveness Award. In a complex market, a small UK digital TV channel wanted to increase its share of the lucrative 16-to-44-year-old male audience.
4. European Commission, op. cit., p. 2.

A unique rebranding exercise, which created a gentleman's club-style ambiance, resulted in remarkable growth. Dave leapt from the 29th biggest channel to the 10th and became the largest among the target segment, attracting eight million new viewers. A tight investment of £100,000 was transformed into a profit of £4.5 million in the first six months alone, and the channel's incremental growth contributed a staggering £25 million in ad-sales revenue for 2008. Dave is now rapidly gaining market share at the expense of larger, more-established channels. This evidence demonstrates the value of the design intervention for this business; proving the impact of design on individual companies is the first step in evaluation.
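To make the arithmetic explicit, a simple calculation based on the figures above (and on the simplifying assumption that the reported £4.5 million is profit net of the design spend) gives the return multiple:

\[
\text{ROI} \;=\; \frac{\text{profit attributed to the rebrand}}{\text{design investment}} \;=\; \frac{\pounds 4.5\ \text{million}}{\pounds 0.1\ \text{million}} \;=\; 45, \quad \text{or roughly } 4{,}500\% \text{ over six months.}
\]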
Evaluating the Return on Design in National Industry

When analyzing the ROI the private design sector offers at a macro level, we can identify that design contributes in two ways: as a sector in itself and as a strategic discipline that contributes to the growth of other sectors in manufacturing and services. For governments to appreciate the significance of design both as a sector and as a strategic discipline, decision-makers need data on both aspects. Design adds value to individual businesses, and therefore it increases the value of national industry. Measuring this impact requires accumulating data about the contribution to individual companies (dimension 1) and establishing how it can be multiplied to represent the impact on industry as a whole. Analyzing design's contribution as a separate sector demands data on the size of the design sector itself, its employment distribution, and its financial contribution, in order to promote an appreciation of the scale of the design industry and its contribution to competitiveness. Surveys on the composition of the design sector have been conducted by a number of national design organizations. A selection would include the Seoul Design Center's Asia Design Survey 2009, the UK Design Council's 2010 Design Industry Insights, and Économie du Design, from l'Agence Pour la Promotion de la Création Industrielle.


To highlight a study on measuring the impact that design has on industry as a whole, we have selected a pioneering study conducted between 2003 and 2006 by the Danish Design Centre in association with the National Agency for Enterprise. Economic Effects of Design set out to measure the level of design activity in Danish businesses, applying a DDC methodology called the Design Ladder as a way of assessing the economic benefits of design in Denmark. The study focused on:
• Total investment in design
• Gross revenue performance and the development in employment and export share of turnover among the companies
• Differences in gross revenue, employment, and exports for companies that adopt a comprehensive approach to design compared with those that do not use design
The study concluded that Danish companies invested an annual total of approximately DKK 7 billion in design. Overall, companies that invested in design showed additional growth in gross revenue of 250% compared with companies that did not. Linking performance data with investment in design thus revealed a correlation between design purchase and economic growth. Using the survey data, companies were categorized into four stages of design maturity, depending on their approach to design investment (Figure 2). The higher a company was ranked on the Design Ladder, the greater the strategic importance it attributed to design.

The Design Ladder is proving to be a successful tool for evaluating design. However, it is important to highlight that a key issue for a successful measurement process is its systematic evaluation. Only the collection of data in consecutive periods provides comparative data and, therefore, meaningful results. By assessing how many companies move up a rung on the Design Ladder once design promotion and policies have been implemented, the Danish government was able to make a tangible assessment of the role of design in industry. Not long afterward, it implemented a policy to further support the national industry's use of design.

Figure 2. The Danish Design Ladder.
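To illustrate how such a categorization might work in practice, the sketch below maps survey answers onto Design Ladder rungs and compares average revenue growth per rung. The rung labels follow the formulation commonly attributed to the DDC (non-design, design as styling, design as process, design as strategy), since the labels in Figure 2 do not survive in this text; the survey fields and figures are hypothetical and are not drawn from the Danish study.

```python
# Hypothetical sketch: categorizing survey respondents on the Design Ladder
# and comparing average revenue growth per rung. The rung labels follow the
# commonly cited DDC formulation; the survey fields and numbers are invented
# for illustration only.

DESIGN_LADDER = {
    1: "Non-design",          # design plays no role
    2: "Design as styling",   # design used only for form-giving and finish
    3: "Design as process",   # design integrated into development processes
    4: "Design as strategy",  # design central to the business strategy
}

def ladder_rung(uses_design: bool, in_development: bool, in_strategy: bool) -> int:
    """Map three yes/no survey answers to a Design Ladder rung (1-4)."""
    if not uses_design:
        return 1
    if in_strategy:
        return 4
    if in_development:
        return 3
    return 2

# Toy survey records: (uses_design, in_development, in_strategy, revenue_growth_pct)
responses = [
    (False, False, False, 2.0),
    (True,  False, False, 4.5),
    (True,  True,  False, 7.0),
    (True,  True,  True,  11.5),
    (True,  True,  True,  9.0),
]

growth_by_rung: dict[int, list[float]] = {}
for uses, dev, strat, growth in responses:
    growth_by_rung.setdefault(ladder_rung(uses, dev, strat), []).append(growth)

for rung in sorted(growth_by_rung):
    values = growth_by_rung[rung]
    print(f"{DESIGN_LADDER[rung]}: avg revenue growth "
          f"{sum(values) / len(values):.1f}% (n={len(values)})")
```

Repeating such a categorization in consecutive survey waves is what allows movement up the Ladder, rather than a single snapshot, to be measured.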


Evaluating the Return on Design Programs and Policies

Here we analyze investments made by the public sector at the micro level, that is, in individual programs and policies for design support and promotion. This is an increasing concern, particularly in Europe, where the ROI of public funds is a pertinent issue in light of government cuts. Design programs work to introduce design to individual companies or to encourage industry and the government itself to make better use of design resources. Typically, these programs are government-funded, so evaluating their impact is a matter of accountability, as well as of improving the process of delivery.

During the development of this research, the authors were able to work with the group of partners involved in the SEE project, a network of 11 design organizations in Europe working to integrate design into innovation policies. Sampling these 11 design programs, the project managers completed a self-assessment questionnaire to ascertain how effectively their design support programs were evaluated. Questions included:
• What were your program's targets?
• How frequently was it evaluated?
• How were the delivery and impact of the program measured?
• How was data collected?
• What were the consequences of evaluation?
Using the results of this exercise, we were able to identify some shortfalls that could be improved upon. Ideally, evaluation needs to take place at regular intervals to be effective. We observed that, on average, the partners' programs ran from three to five years; however, in five of the eleven cases, evaluation was conducted only at the end of the program, and only four programs were evaluated annually. The most effective evaluation models established benchmark points at the outset to demonstrate the impact of the design intervention. We also discovered that significantly more emphasis was placed on the delivery of the program than on its impact. Typically, it was activities that were counted, such as how many seminars or exhibitions had been organized, how many publications had been produced, and how many SMEs had been offered advice. (The repercussions of SME involvement in these programs were not sufficiently captured.) Although in all cases the program of activities was comprehensive, few had set their performance goals in objective, quantifiable, and measurable terms. Without clear targets, the appropriate data cannot be collected to assess whether a program is achieving its goals. For example, many of the objectives were intangible, such as improving cooperation between business and academia, enhancing the competitiveness of SMEs, or raising awareness of design. In addition, many positive results that had not been included in the initial goals were not taken into consideration in the evaluation process. Because some programs are repeated in periodic cycles, it is important that evaluation takes place at the end of each cycle to ensure improvement. Therefore, the real evaluation challenge is one of emphasizing learning and adaptation rather than of merely informing a decision as to whether to continue the program. In the context of government policies and programs, evaluation needs to be systematic, examining both operations (delivery) and outcomes (effects), comparing results with initial and emerging goals, and using evaluation for ongoing improvement.


Who undertakes this evaluation is open for discussion, but in the short term it will probably fall to those delivering design-related programs to identify and undertake appropriate evaluation that will find acceptance with policymakers over the long term.
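As a minimal sketch of what objective, quantifiable, and measurable targets might look like, the example below pairs a baseline with annual measurements for both delivery and outcome indicators. The program, indicator names, and figures are entirely hypothetical; the point is simply that targets set in this form can be checked at each evaluation cycle rather than only at the end of the program.

```python
# Hypothetical sketch of an evaluation plan that records a baseline and
# periodic measurements for both delivery (activities) and outcome (impact)
# indicators. All names and numbers are invented for illustration.

from dataclasses import dataclass, field

@dataclass
class Indicator:
    name: str
    kind: str              # "delivery" or "outcome"
    baseline: float        # value measured at the program's outset
    target: float          # value the program commits to reach
    measurements: dict[int, float] = field(default_factory=dict)  # year -> value

    def met_by(self, year: int) -> bool:
        """Crude check: has the measurement for a given year reached the target?"""
        return self.measurements.get(year, self.baseline) >= self.target

# Example: a fictitious three-year design support program, evaluated annually.
indicators = [
    Indicator("SMEs receiving design advice", "delivery", baseline=0, target=300),
    Indicator("Assisted SMEs launching new products (%)", "outcome", baseline=12, target=25),
]
indicators[0].measurements = {1: 90, 2: 210, 3: 320}
indicators[1].measurements = {1: 14, 2: 19, 3: 27}

for ind in indicators:
    status = "met" if ind.met_by(3) else "not met"
    print(f"{ind.kind:>8} | {ind.name}: baseline {ind.baseline}, "
          f"target {ind.target}, year-3 value {ind.measurements[3]} ({status})")
```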
Evaluating the Return on Design in Economy and Society

What is design's contribution compared with that of other sectors and disciplines? As James Moultrie has pointed out, "Whilst there is some evidence to demonstrate the value of design to the firm, there are very few studies that have successfully demonstrated the value of design at a regional or national level."5 Design's impact can be multi-dimensional: economic, environmental, and social. Currently, this broader dimension of our understanding of design is underdeveloped, since linking national design capabilities with economic performance entails inherent causality queries, particularly given the scarcity of reliable data. Design is a dynamic tool for the innovation process; yet while innovation is well measured in many scoreboards, such as the Community Innovation Survey, design is not similarly captured.6
5. J. Moultrie, Developing an International Design Scoreboard, SEE bulletin issue 1 (2009), University of Wales Institute, Cardiff, pp. 3-6.

The challenge remains one of how to include questions on design in such surveys; doing so would allow for a comparative analysis across Europe and beyond. Individually, countries can assess design's contribution to the economy in relation to other sectors. In Wales, the Creative and Cultural Industries Economic & Demographic Footprint research, developed by the Creative & Cultural Skills council in 2008, is based on data collected from sources including the Annual Population Survey, the Inter-Departmental Business Register, and the Annual Business Inquiry.
6. As mentioned in Hollander (op. cit.) and Moultrie (ibid.).

It defines the creative and cultural industries in terms of advertising, craft, cultural heritage, design, literature, music, performing arts, and visual arts. In Wales, 24,060 people are employed in the creative industries, and design accounts for the greatest proportion of Wales's creative and cultural industries (22 percent). The Welsh creative industries contribute £465 million GVA annually to the UK economy, of which 36 percent comes from design (Figures 3 and 4).
Figure 3. Creative industry productivity levels in Wales: gross value added (Design 36%, Performing Arts 18%, Craft 15%, Music 15%, Visual Arts 8%, Literary Arts 6%, Advertising 1%, Cultural Heritage <1%).


The value of this research is that it contextualizes the role and contribution of design compared with other creative industries.
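As a rough indication of absolute scale, and assuming the 36 percent share applies directly to the £465 million total quoted above, design's contribution can be back-calculated as:

\[
0.36 \times \pounds 465\ \text{million} \;\approx\; \pounds 167\ \text{million in GVA per year attributable to design in Wales.}
\]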
Conclusion

As demonstrated in the proposed framework, examining the rate of return on design investment is multidimensional and poses many challenges. Among the challenges associated with evaluating design to feed into policy are:
• Lack of common definitions and parameters for the design discipline
• Absence of commonly available measures, indicators, and statistics that can be compared internationally
• Unclear criteria for success in projects, programs, or policies when objectives are not well defined at the outset
• Difficulties in isolating design's contribution and impact from the broader context (particularly distinguishing it from traditional measures of innovation)
• Costly evaluation processes
• Political sensitivity to the results of the evaluation
• Failure to use the evaluation process as a management tool to improve delivery
Although the case for design's contribution to business performance is strong, the extent of design's role as separate from the interaction of other disciplines remains subject to skepticism.

However, this should not halt the progress of policy discussion on the value of design for companies. Indeed, the EU recognizes that the findings of micro-economic research on design are conclusive: the use of design has a positive impact on the performance of a company, measured in terms of, for example, profitability, share price, employment, or exports.7 There is a need for pre- and post-measurement across a combination of soft and hard indicators to obtain more evidence of the efficacy of design use within individual companies. By adding value to individual companies, design adds value to national industry. However, without data on the composition of the design sector in terms of employment, geographical distribution, and revenue, decision-makers will not be able to appreciate the scale of the design sector and its impact.

7. European Commission, op. cit., p. 25.

Figure 4. Employment in the creative industries in Wales (Design 22%, Music 17%, Craft 14%, Performing Arts 13%, Visual Arts 11%, Cultural Heritage 11%, Literary Arts 10%, Advertising 2%).


The effect of design on industry as a whole, as well as the growth of the design sector, can be measured over consecutive periods for comparable results. One-off studies are interesting but do not provide insight into how the application of design resources is evolving. Research on this scale is costly, but a number of organizations have already developed the processes for conducting surveys of this magnitude, and the data gathered in such exercises has proved valuable.

Current practice in evaluating design programs reveals a need to broaden the focus toward outcomes and impact, rather than concentrating narrowly on delivery and operations. A wide variety of design support programs operate in Europe; to improve those currently in existence and to make way for future practice, we must improve the effectiveness of their evaluation. The parameters for how programs are measured are often constrained by frameworks determined by the source of funding. This, although essential for the appropriate accountability of public funds, can steer the operation of the program and may not reflect its intended objectives.

Innovation is well captured in international surveys and scoreboards, which have enabled governments to enhance policy performance. Although design is gaining recognition at the policy level, no studies have yet indisputably established causal links between design and socio-economic development by isolating design from its wider context. Nevertheless, studies that have provided evidence of positive correlations between design and its wider impacts on the economy and society should not be discouraged. The application of evaluation techniques for design needs to move beyond economic measures to capture social and environmental change. We conclude by reinforcing the complex nature of design evaluation, but highlight the need to overcome certain inefficiencies in current measuring practices.

Improving our evaluation capabilities and understanding why and when evaluation is needed are paramount to raising awareness of the importance of design on the policy agenda, as well as to improving the quality of strategies for design practice and promotion. This article identifies the need for an evaluation process that embraces all four aspects of the framework: public and private sectors at both micro and macro levels.
Acknowledgements

This article is drawn from SEE Policy Booklet 3: Evaluating Design, a publication by the SEE project, which is co-financed by ERDF through the INTERREG IVC program (www.seeproject.org). The authors would like to thank and acknowledge the contribution made by all the SEE partners who participated in the SEE workshop where this content was initially discussed: Design Flanders, Danish Design Centre, Estonian Design Centre, Aalto University, ARDI Rhône-Alpes Design Centre, Centre for Design Innovation, Consorzio Casa Toscana, Cieszyn Castle, BIO/Museum of Architecture and Design, and Barcelona Design Centre.
Reprint #11222RAU44

