
Tuesday, September 27, 2011

The Growing Interest & Concern About the Future of Professional Translation

I have noticed of late that every conference has a session or two that focuses on the future, probably because many sense that change is in the air. Some of you may also have noticed that the protests from some quarters against the ideas presented at these future-outlook sessions have grown more strident, even outright rude. The most vocal protests seem to be directed at predictions about the increasing use of machine translation, anything about “good enough” quality, and the production-process changes necessary to deal with increasing translation volume. (There are still some who think that the data deluge is a myth.)

Some feel personally threatened by those who speak on these subjects and rush to kill, or at least stab, the messenger. I think they miss the point that what is happening in translation is just part of a larger upheaval in the way global enterprises interact with customers. The forces causing change in translation are also creating upheaval in marketing, operations and product development departments, as many analysts have remarked for some time now. The discussion in the professional translation blogosphere is polarized enough (translators vs. technology advocates) that dialogue is difficult, but hopefully we all continue to speak with increasing clarity so that the polemic subsides. The truth is that none of us really knows the future, but that should not stop us from making educated (or even wild) guesses at where current trends may lead. (I highly recommend you skim The Cluetrain Manifesto to get a sense of the broader forces at play.)
Brian Solis has a new book coming out, The End of Business As Usual, that describes the overall change quite succinctly. It explores each layer of the complex consumer revolution that is changing the future of business, media, and culture. As consumers further connect with one another, a vast and efficient information network takes shape and begins to steer experiences, decisions, and markets. It is nothing short of disruptive.
I was watching the Twitter feeds from two conferences last week (LRC XVI in Ireland and Translation Forum in Russia) and thought it would be interesting to summarize and highlight some of the tweets as they pertain to this changing world, and perhaps provide more clarity about the trends from a variety of voices and perspectives. The LRC conference had several speakers from large IT companies who talked about their specific experience, as well as technology vendor and LSP presentations. For those who are not aware, CSA research identifies IT as one of the largest sectors buying professional translation services. The chart below shows the sectors with the largest share of global business, and is probably a good way to understand where these changes are being felt most strongly.
[Chart: industry sectors with the largest share of global translation business, per CSA research]

Here are some Twitter highlights from LRC on the increasing volume of translation, changing content, improving MT and changing translation production needs. I would recommend that you check out @therosettafound for the most complete Twitter trail. I have made minor edits for context and to expand abbreviations, and have attempted to give the tweet trail some basic organization to make it more readable.
@CNGL Changing content consumption and creation models require new translation and localisation models – (according to) @RobVandenberg
@TheRosettaFound We are all authors, the enterprise is going social - implications for localisation?
@ArleLommel Quality even worse than Rob Vandenberg says: we have no real idea what it is/how to measure, especially in terms of customer impact
Issue is NOT MT vs. human translation (HT). It's becoming MT AND HT. Creates new opportunities for domain experts.
Dion Wiggins: LSPs not using MT will put themselves out of business? Prediction: yes, in four to five years
CNGL says 25% of translators/linguists use MT. I wonder how many use it but say they don't use it due to (negative) perception (with peers)?
Waiting for translation technology equivalent of iPhone: something that transforms what we do in ways we can't yet imagine.

Tweets from Jason Rickard’s presentation on Collaborative Translation (i.e. Crowdsourcing) as enterprise IT goes social.
@TheRosettaFound Jason of Symantec giving the enterprise perspective, added 15-20 languages to small but popular product, built tech to support this. Not just linguistic but also legal, organizational issues to be resolved in collaborative, paid-for product.
Is collaborative translation bad and untimely? Not so; many translators are involved users of the content/product they translate.
Review process is different in collaborative translation. Done by voting, not by editors
The smaller the language gets, the more motivated volunteer translators are and the better collaborative translation works.
Is volunteering something for people who don't have to worry that their day-to-day basics are covered?
Does collaborative translation and collaboration mean that content owners "give up the illusion of control" over their content?
Enterprises do collaborative translation for languages they would/could not cover otherwise - true, for both profit and non-profits
Collaborative/Community will not replace existing service providers but open up more content for more languages
Language Service Providers could play an important role in community translation by building, supporting, moderating communities
It's not correct to say Community Translation = bad; Professional Translation = good
Microsoft appoints moderators with a passion for the project/language for community localization
>1,200 users translated Symantec products into 35 languages
If >1,200 were needed to translate 2 small-ish products, how can millions of translators translating 1 ZB be 'managed'?
@ArleLommel Symantec research: Community involvement in support often leads to ~25% reduction in support spend
“Super users” are what make communities scalable. Key is to identify/cultivate them early in the process
Jason Rickard: Dell is a good example of using Facebook for support. One of few companies with real metrics and insight in this area.
Jason Rickard: Symantec has really cool/systemic/well-thought ways to support community


@TheRosettaFound 21st century localisation is about the user, about user-generated content - Ellen Langer: Give up the Illusion of Control
@ArleLommel Illusion of control? You mean we can have even less control than we have now? That's a scary thought!
@TheRosettaFound The most dramatic shifts driven by the web happened because communities took over - Imagine: 100000s of user translators translating billions of words into 100s of languages - control that!
Seems the deep and complex problems of localisation are a minute drop in the ocean of digital content management
@CNGL Discovery, analysis, transformation - Alex O'Connor tells how CNGL is addressing the grand challenges of digital content management
@TheRosettaFound Is the L10N industry due for a wave of destruction or positive transformation?
@ArleLommel Yes. Most of the mainstream technologies for translators are non-ergonomic and still stuck in 20-year-old paradigms

Tweets from Tony Allen, Product Manager Intel Localisation Solutions presentation
@TheRosettaFound 30+ languages, >200k pages, >40% localised at Intel's web presence. Intel: important to have user-driven content and interaction with the customer. Integration is important, e.g. multilingual support chat. Integration and interoperability are key issues for Intel L10N. Figuring out how content flows, without loss of metadata, and interoperates with a range of internal/external systems is crucial.
2.5 billion netizens, >15 billion connected devices and >1 zettabyte of traffic by 2015, and companies will interact with their customers using social media-type setups; new challenges for localization.
#intel What does it mean for localization infrastructures if we have >1 zettabyte of content in 2015? Current methods won't keep up
@ArleLommel #intel says that interoperability standards are required for cloud to meet future demands. L10n must evolve to meet this need too.

@ArleLommel Alison Toon (#hp) puts it this way: “localization (people) are the garbage collectors of the documentation world”
@TheRosettaFound 600GB of data in Oracle's Translation Platform - We need concise, well-structured content - then we will be able to deliver efficient translation services - How to get it right: analyze content, identify problems and throw them back in the face of writers and developers. I18N and L10N have to get into the core curriculum at universities, says Paul Leahy (of Oracle), since too much time is spent teaching them on the job.

Tweets from Sajan / Asia Online MT presentation
@TheRosettaFound MT cannot perform magic on bad source text - user-generated non-native-speaker content is often 'bad'
MT errors make me laugh... but human errors make me cry - an old quote recycled from previous presentations... Asia Online
Dirty Data SMT - what kind of translations would you expect? If there are no humans involved you are training on dirty data, says Asia Online. Sajan achieved a 60% reduction in costs and 77% time savings for a specific project - a miracle? Probably not, let's see.
Millions of words faster, cheaper, better translated by Sajan using Asia Online - is this phenomenal success transferable? How?
XLIFF contributed to the success of Sajan/Asia Online's MT project. Asia Online's process rejected 26% of TM training data.

Tweets from Martin Orsted, Microsoft presentation
@TheRosettaFound Cloud will lead to improved cycle times and scalability: 100+ languages, millions of words
Extraordinary scale: 106 languages for the next version of Office. Need a process that scales up & down in proportion.
Microsoft: We have fewer people than ever and we are doing more languages than ever
Martin: "The Language Game - Make it fun to review Office"... here is a challenge :) Great idea to involve community via game
How can a "Game" approach be used for translation? Levels of experience, quality, domains, complexity; rewards?
No more 'stop & go', just let it flow @RobVandenberg >> Continuous publishing requires continuous translation. New workflows

Tweets from Derek Coffey, Welocalize presentation: Are we the FedEx or the Walmart of words?
@TheRosettaFound TMS of SDL = burning stacks of cash - Reality: we support your XLIFF, but not your implementation
Lack of collaboration, workflow integration, content integration = most important bottlenecks. Welocalize, MemoQ, Kilgray and Ontram are working on a reference implementation - Derek: Make it compelling for translators to work for us
It's all about the translators, and they will seek to maximise their earning potential, according to Derek.

Tweets from Future Panel
@TheRosettaFound Many translators don't know what XML looks like
Rob: more collaborative, community translation - Rob: Users who consume content will have a large input into the translation BINGO
Tony: users will drive localisation decisions, translation live
Derek: future is in cooking xxx? Open up a whole new market - user generated, currently untranslatable content. HUGE market
Derek: need to re-invent our industry, with focus on supply chain
The end of the big projects - how are we going to make money? (question from audience)
From service/product to community - the radical change for enterprises, according to Fred
No spark, big bang, revolution - but continuous change, Derek
Big Spark (Dion): English will no longer remain the almost exclusive source language

The Translation Forum Russia Twitter trail has a much more translator-oriented focus and is also bilingual. Here are some highlights below, again with minor edits to improve readability.

@antonkunin Listened to an information-packed keynote by @Doug_Lawrence at #tfru this morning. As rates keep falling, translators' income keeps rising. (See the arithmetic sketch after these highlights.)
@ilbarbaro Talking about "the art of interpreting and translation" in the last quarter of 2011 is definitely outdated
Language and "quality" are important for translators, speed and competence for (final) clients. Really?
Translators are the weakest link in the translation process
Bert: here-and-now translation is more important than perfect translation
Bert on fan subbing as an unstoppable new trend in translation
Is Bert anticipating my conclusions? Noah's ark was made and run by amateurs, RMS Titanic by professionals
Carlos Incharraulde: terminology is pivotal in translator training < primarily as a knowledge transfer tool
To @renatobeninatto, who said: translation companies can do without process standards < I don't agree
@renatobeninatto: Start looking at income rather than price/rates
Evaluating translation is like evaluating haircuts - it's better to be good on time than perfect too late
Few translation companies do like airlines: 1st class/ Economy/ Discount rates – Esselink
Traditional translation models only deal w/ tip of iceberg. New models required for other 90%. Esselink
Good enough revolution. Good enough translation for Wikileaks, for example. Bert Esselink
In 2007 English Russian was $0.22 per word, in 2010 it dropped to $0.16 @Doug_Lawrence
There's much talk on innovation but not much action - don't expect SDL and Lionbridge to be innovative
@Doug_Lawrence all languages except German and French decreased in pricing from 2007 to 2010
@AndreyLeites @ilbarbaro Problem-solving is the most important skill a translator should acquire - Don't teach translators technology, teach them to solve problems - language is a technology, and we need to learn to use it like one - 85% of translators are still women
@ilbarbaro Three points on quality: 1. Quality is never absolute. 2. Quality is defined by the customer. 3. Quality can be measured - it is necessary to learn to define quality requirements (precisely)
@Kilgraymemoq announces that they will open Kilgray Russia before the end of the year
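
On the rates point: @Doug_Lawrence's numbers (English-Russian dropping from $0.22 to $0.16 per word) can coexist with rising translator income if throughput grows faster than rates fall, which is presumably what MT-assisted and TM-assisted work enables. Here is a minimal arithmetic sketch in Python; the per-word rates come from the tweets above, but the daily word counts and working days are hypothetical illustrations, not reported figures.

```python
# Rates are from @Doug_Lawrence's tweets above; throughput figures are
# hypothetical illustrations, not reported data.
rate_2007 = 0.22            # USD per word, EN->RU, 2007
rate_2010 = 0.16            # USD per word, EN->RU, 2010

words_per_day_2007 = 2500   # assumed: traditional TEP throughput
words_per_day_2010 = 4000   # assumed: TM + post-edited MT throughput
working_days = 220          # assumed working days per year

income_2007 = rate_2007 * words_per_day_2007 * working_days  # $121,000
income_2010 = rate_2010 * words_per_day_2010 * working_days  # $140,800

rate_decline = (rate_2007 - rate_2010) / rate_2007           # ~27% drop
income_change = (income_2010 - income_2007) / income_2007    # ~+16%

print(f"Rate decline 2007->2010: {rate_decline:.1%}")
print(f"Income 2007: ${income_2007:,.0f} -> 2010: ${income_2010:,.0f} ({income_change:+.1%})")
```

Under these assumed numbers, a 27% rate decline still yields a 16% income gain, which is one plausible reading of the apparent paradox in that keynote.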

This is of course my biased extraction from the stream, but the original Twitter trail will be there for a few more weeks and you can check it out yourself. It is clear to me from the comments above that, at the enterprise level, MT and community initiatives will continue to gather momentum. Translation volumes will continue to rise, and production processes will have to change to adapt. I also believe there are translators who are seeking ways to add value in this changing world, and I hope that they will provide the example that leads the way through this flux.

And for a completely different view of "the future" check this out.


Monday, September 12, 2011

Understanding Where Machine Translation (MT) Makes Sense

One of the reasons that many find MT threatening, I think, is the claim by some MT enthusiasts that it will do EXACTLY the same work that was previously done by multiple individuals in the “translate-edit-proof” (TEP) chain - without the humans, of course. To the best of my knowledge this is not possible today, even though an occasional sentence may come out where this does indeed happen. If you want final output that is indistinguishable from competent human translation, then you are going to have to use the human “edit-proof” chain to make this happen.

Some in the industry have attempted to restate the potential of MT from Fully Automated High Quality Translation (FAHQT - notice how that sounds suspiciously like f*&ked?) to Fully Automated Useful Translation (FAUT). Even so, in some highly technical domains it is actually possible for carefully customized MT systems to outperform exclusively human-based production, because it is simply not possible to find as many competent technical translators as are required to get the work done.

We have seen that both Google and Bing have gotten dramatically better since switching from RbMT to statistical data-driven approaches, but the free MT solutions have yet to deliver really compelling quality outside of some Romance languages, and their quality is usually far from competent human translation. They also offer very little control, even if you are not concerned about the serious data privacy issues their use brings. It is usually worthwhile for professionals to work with specialists who can help them customize these systems for the specific purpose they are intended to serve. MT systems evolve, and can get better with small amounts of corrective feedback if they are designed from the outset to do so. Somebody who has built thousands of MT systems across many language combinations is likely to offer more value and skill than most can achieve by building a handful of systems with tools like Moses, or through the limited dictionary-building input possible with many RbMT systems. And how much better can customized systems get than the free systems? Depending on the data volume and quality, the gain can range from small but very meaningful improvements to significantly better overall quality.

So where does MT make the most sense? Given the significant effort required to customize an MT system, it usually makes the most sense when you have ongoing high volume, dynamically created source data, tolerant users, or any combination thereof. It is also important to understand that the higher the quality requirements, the greater the need for human editing and proofing. The graphic below elaborates on this.
[Graphic: quality requirements vs. the amount of human editing and proofing needed]

While MT is unlikely to replace human beings in any application where quality is really important, there are a growing number of cases that show that MT is suitable for:
  • Highly repetitive content where productivity gains with MT can exceed what is possible with TM alone
  • Content that would just not get translated otherwise
  • Content that simply cannot afford human translation
  • High value content that is changing every hour and every day but has a short shelf life
  • Knowledge content that facilitates and enhances the global spread of critical knowledge, especially for health and social services
  • Content that is created to enhance and accelerate communication with global customers who prefer a self-service model
  • Real-time customer conversations in social networks and customer support scenarios
  • Content that does not need to be perfect but just approximately understandable

So while there are some who would say that MT can be used anywhere and everywhere, I would suggest that a better fit for professional use is where you have ongoing volume and dynamic but high-value source content that can enhance international initiatives. To my mind, customized MT does not make sense for one-time, small localization projects where the customization effort cannot be leveraged frequently. Free online MT might still prove of some value in these cases, to boost productivity, but as language service providers learn to better use and steer MT, I expect we will see them provide translators access to “highly customized internal systems” for project work, where the value to translators will be very similar to the value provided by high-quality translation memory. Simply put - it can and will boost productivity even for things like user documentation and software interfaces.

It is worth understanding that while “good” MT systems can enhance translator productivity in traditional localization projects, they can also enable completely new kinds of translation projects that have larger volumes and much more dynamic content. While we can expect that these systems will continue to improve in quality, they are not likely to produce TEP-equivalent output. I expect that these new applications will be a major source of work for the professional translation industry but will require production models that differ from traditional TEP production.

However, we are still at a point in time where there is not a lot of clarity on what post-editing, linguistic steering and MT engine refinement really involve. They do in fact involve many of the same things that are of value in standard localization processes, e.g. unknown word resolution, terminological consistency, DNT (do not translate) lists and style adjustments. They also increasingly include new kinds of linguistic steering designed to “train” the MT system to learn from historical error patterns and corrections. Unfortunately, many of the prescriptions on post-editing principles available on LinkedIn and translator forums are either linked to older-generation (RbMT) systems that really cannot improve much beyond a very limited point, or are tied to one specific MT system. In the age of data-driven systems new approaches are necessary, and we have only just begun to define them. These new hybrid systems also allow translators and linguists to create linguistic and grammar rules around the pure data patterns. Hopefully we will see much better “user-friendly” post-editing environments that bring powerful error detection and correction utilities into much more linguistically logical and proactive feedback loops. These tools can only emerge if more savvy translators are involved (and properly compensated) and we successfully liberate translators from dealing with the horrors of file and format conversions and other non-linguistic tedium that translation today requires. This shift to a mostly linguistic focus would also be much easier with better data interchange standards. The best examples of these, thus far, come from Google and Microsoft rather than the translation industry.
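
To make one of these routine tasks concrete, here is a minimal sketch of DNT-list handling: protected terms are masked with placeholders before the text goes to an MT engine and restored afterwards, so the engine cannot mangle them. The term list and the stubbed-out translate call are hypothetical illustrations, not any particular MT system's API.

```python
import re

DNT_TERMS = ["Asia Online", "XLIFF", "Language Studio"]  # hypothetical DNT list

def mask_dnt(text, terms):
    """Replace DNT terms with numbered placeholders an MT engine won't touch."""
    mapping = {}
    for i, term in enumerate(terms):
        placeholder = f"__DNT{i}__"
        pattern = re.compile(re.escape(term), re.IGNORECASE)
        if pattern.search(text):
            text = pattern.sub(placeholder, text)
            mapping[placeholder] = term
    return text, mapping

def unmask_dnt(text, mapping):
    """Put the original DNT terms back after translation."""
    for placeholder, term in mapping.items():
        text = text.replace(placeholder, term)
    return text

source = "Asia Online publishes its engines via XLIFF-friendly workflows."
masked, mapping = mask_dnt(source, DNT_TERMS)
# translated = mt_engine.translate(masked)   # stand-in for a real MT call
translated = masked                          # identity "translation" for the demo
print(unmask_dnt(translated, mapping))
```

Terminology consistency checks and unknown-word resolution follow the same pattern: small, mechanical pre- and post-processing steps wrapped around the engine, which is exactly the kind of tedium that better tooling should take off the translator's plate.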


Speaking of standards, possibly the most worthwhile of the initiatives focusing on translation data interchange standards is meeting in Warsaw later this month. The XLIFF symposium is, IMO, the most concrete and practical standards discussion going on at the moment, and includes academics, LSPs, TAUS, tools vendors and large buyers sharing experiences. The future is all about data flowing in and out of translation processes, and we all stand to benefit from real, robust standards that work for all constituencies.
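
For readers who have never looked inside the format: an XLIFF file is essentially paired source/target segments plus enough metadata for tools to interoperate. The sketch below generates a tiny XLIFF 1.2 file with Python's standard library; the file name, language pair and segment strings are hypothetical, and real files carry much more (notes, match percentages, inline tags).

```python
import xml.etree.ElementTree as ET

NS = "urn:oasis:names:tc:xliff:document:1.2"
ET.register_namespace("", NS)

xliff = ET.Element(f"{{{NS}}}xliff", {"version": "1.2"})
file_el = ET.SubElement(xliff, f"{{{NS}}}file", {
    "original": "ui_strings.properties",  # hypothetical source file
    "source-language": "en",
    "target-language": "ru",
    "datatype": "plaintext",
})
body = ET.SubElement(file_el, f"{{{NS}}}body")

# Hypothetical source/target segment pairs.
segments = [
    ("1", "Scan complete.", "Проверка завершена."),
    ("2", "No threats found.", "Угроз не обнаружено."),
]
for unit_id, src, tgt in segments:
    tu = ET.SubElement(body, f"{{{NS}}}trans-unit", {"id": unit_id})
    ET.SubElement(tu, f"{{{NS}}}source").text = src
    ET.SubElement(tu, f"{{{NS}}}target").text = tgt

ET.ElementTree(xliff).write("sample.xlf", encoding="utf-8", xml_declaration=True)
```

The point of a standard like this is that any compliant tool, whether a TM editor, a TMS or an MT engine, can consume and emit this same payload without lossy, one-off conversions.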

Monday, September 5, 2011

The Continuing Saga & Evolution of Machine Translation

I recently attended the 7th IMTT Conference in Cordoba, Argentina. I especially enjoy the IMTT events because somehow they have found a content formula that works for both translators and LSPs; you get to see the translation supply chain communicate in real time. The overall culture of their events is also usually very collaborative, and to my mind they are the place to see the most open and constructive dialogue between translators and agencies. Some may not be aware that Argentina has a particularly strong concentration of skilled people who understand the mechanics of localization (especially in FIGS - French, Italian, German, Spanish - and Brazilian Portuguese), and many of the agencies, even small ones, are able to work with pretty much every TM and TMS (Translation Management System) tool in the market with more than a basic level of competence. Because of historical decisions made by @RenatoBeninatto many years ago and a great university educational system, Argentina has become a place with a comprehensive and sizable professional translation ecosystem.
There were a few presentations that I found especially interesting, including a plenary presentation by Suzanne de Santamarina on the use of quality metrics. You can see some of the Twitter trail here and here, but basically Suzanne described her very active use of SAE J2450 measurements to improve the dialogue on quality with customers and with her translators. While there is clearly effort and expense involved in implementing this as actively as she has, I think it dramatically improves the conversation about translation quality between all the participants, as it is specific, impersonal and clear about what quality means. It is also a means to build what she called “customer delight”, which of course also includes a major service component.
Quality in a product or service is not what the supplier puts in. It is what the customer gets out (of the product/service) and is willing to pay for. A product is not quality because it is hard to make and costs a lot of money, as manufacturers typically believe. This is incompetence. Customers pay only for what is of use to them and gives them value. Nothing else constitutes quality…
~ Peter Drucker
Asia Online makes a software tool available to enable customers to calculate J2450 scores for this very reason. It helps to move the discussion from unactionable complaints like “I don’t like the quality” or “the quality is not good” to practical error identification and resolution steps like “Is there a way to reduce the frequency of wrong-terminology errors in the system?” Just as proper use of BLEU scores requires care and some expertise, so does the use of J2450. Suzanne’s company’s regular and highly structured use of J2450 is such that they can really assess linguistic quality from project to project with a precision that few have. Her approach is refreshing in its clarity and precision, and quite a contrast from the meandering, inconclusive discussions on “quality” that you see on LinkedIn. Tools like BLEU and J2450 depend on the skill level of the user, and require an investment of time, effort and repeated use to develop real competence before one understands the informational insights their use can provide.
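
To see why this is more actionable than “the quality is not good”, consider how a J2450-style score is computed: each error gets a category and a serious/minor severity, the weighted points are summed, and the total is normalized by the source word count. The sketch below uses the J2450 category names; the numeric weights follow one commonly cited assignment and should be treated as illustrative rather than authoritative, and the sample error counts are invented.

```python
# Weights are (serious, minor) points per error; treat as illustrative
# and check the SAE J2450 standard itself before relying on them.
WEIGHTS = {
    "wrong_term":      (5, 2.5),
    "syntactic":       (4, 2),
    "omission":        (4, 2),
    "word_structure":  (4, 2),
    "misspelling":     (3, 1.5),
    "punctuation":     (2, 1),
    "miscellaneous":   (3, 1.5),
}

def j2450_score(errors, source_word_count):
    """Normalized score: total weighted error points per source word.
    `errors` maps category -> (serious_count, minor_count). Lower is better."""
    points = 0.0
    for category, (serious, minor) in errors.items():
        w_serious, w_minor = WEIGHTS[category]
        points += serious * w_serious + minor * w_minor
    return points / source_word_count

# Hypothetical review of a 1,000-word sample:
sample_errors = {
    "wrong_term":  (2, 3),   # 2 serious, 3 minor
    "misspelling": (0, 4),
    "punctuation": (1, 2),
}
score = j2450_score(sample_errors, 1000)
print(f"J2450-style score: {score:.4f} weighted points per word")  # 0.0275
```

Because the score is normalized per word, a 1,000-word sample and a 100,000-word project become directly comparable, which is precisely what makes the project-to-project tracking described above possible.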
I also enjoyed the presentations by master translators like João Roque Dias and Danilo Nogueira on the craft and art of translation, and enjoyed talking to them about MT and the life of the translator in general. (Yes, MT is sometimes useful even for some of them.) There were several skill-focused presentations on language QA tools, CAT and collaborative tools that were also very interesting. I heard great things about Val Ivonica’s presentation (in Brazilian Portuguese) on translation productivity tools, which I was unable to attend as it coincided with mine. It is interesting that Patricia Bown positioned MemoQ as collaboration software that enables the linear TEP model to evolve, enabling faster turnaround and higher volume. There were many Brazilians present (though some said not enough) and they lived up to their reputation for revelry, but were unfortunately thwarted in their (our) attempts to find a karaoke place one evening. Nevertheless they shared their linguistically oriented humor with me, and I had no difficulty finding a willing interpreter even though I was often the only person who did not speak the language.

I delivered a presentation on the emerging role of MT as a means to deal with the translation challenges created by the content explosion and new kinds of dynamic product- and business-related content. The feedback I received was mostly positive and constructive, even though there were several very skeptical translators in the crowd. Some in the audience had already experienced MT that works, and even those who had not worked with customized systems admitted that sometimes MT was useful. I was also on a panel on “The Future of the Industry”, which got mixed reviews: some translators felt it was not relevant, and others felt it was a tired topic that nobody had any real clarity on. Many feel change is coming but are not clear what this really means, and unfortunately for many the end result of these changes is that customers expect more work for less money. This does create a certain amount of apprehension amongst attendees, as the future is not quite predictable.
A blog entry by Emily Stewart a few days later, pondering the theme of technology-driven change at the conference, triggered an interesting and ongoing discussion on LinkedIn. Her post, which is about the advent of technology in a variety of different markets, is thoughtful and worth reading. I also think her conclusion (shown below) is good advice for us all.
Instead of denouncing machine translation as the end of the translation world as we know it, it may be time to take a step back and see what happens.  The discussion shouldn’t stop, but perhaps it could become less polemic and instead convert into a deeper conversation on and reflection of what may or may not lie ahead.
While initially there is a lot of focus on the perceived threat (there are some who think that I, together with other overzealous MT developers, am responsible for some of this fear and FUD), I am hoping that the dialogue moves beyond this point. Some MT systems have indeed been used to push rates down unfairly, but as we all begin to better understand these early mishaps, this can and must change. Remember how George W Bush misspoke when he tried to say "Fool me once, shame on you; fool me twice, shame on me"? (I hope you click on the Bush video, it is toooo funny.) If it becomes clearer to everybody what it actually takes to “finish off” MT output to required target quality levels, this kind of abuse cannot continue. We need better quality assessment so that this gap can be more clearly defined.

All MT systems are not equal, and a one-size-fits-all post-editing pricing policy is guaranteed to create disenchantment. We all need to better understand where to use MT and where to avoid it. MT cannot easily, if ever, replace humans on the same projects that were previously done through a careful human TEP process; if the quality expectations are high, it has to be MT and human. MT makes the most sense where there is ongoing volume and information volatility. We also need to better understand how to quickly assess the output quality of different MT systems so that post-editors are compensated fairly. The best MT systems are yet to come, and they will be better because they are the product of informed linguistic steering in addition to standard data and MT techniques. We have yet to see useful compensation models develop for these linguists, and this will probably be needed before some of the uneasiness dissipates, but the forces driving this expanding need are strong, and hopefully we will recognize the value of these key individuals at some point in the future. This is already true at Asia Online, so I imagine it can be done elsewhere.

In terms of disintermediation, I think MT will be only part of the whole picture, as we see more people learn to use motivated communities to get work done. Adobe and others have learnt to use “the crowd” to get traditional localization work done using translation platforms like Lingotek and newcomer Smartling (which might also have obtained the biggest startup investment made by a VC in the translation industry). Much of the coming change will also be driven by collaboration software platforms like Lingotek, Smartling and others yet to come, which change how translation projects get done in terms of process flow and have a different modus operandi from traditional localization tools born in the TEP world. Translators are required to spend too much time working with data in different formats and too little time on the actual act of translation. New collaboration platforms and real data interchange standards will hopefully enable translators to focus mostly on real linguistic problem solving, and not on managing archaic and arcane format interchange issues.

From my vantage point, I see that -
1) Translation is increasingly done outside the sphere of the localization industry, and community-based translation initiatives are gathering momentum in both the non-profit and corporate worlds
2) The volume of translation that can help drive international business initiatives forward is increasing at a substantial rate (5X to perhaps as much as 100X). Interestingly, there are still some who think that this content explosion is a myth.
3) Social network conversations matter and are often more important to translate than having user documentation that is "perfect" and “error-free”. Company-to-customer communications have also become much more interactive, real-time and urgent, and go way beyond the scope of most user documentation.

Thus, to approach every translation task with the TEP mindset that made great sense in the 1X or 2X volume days is not useful today. New approaches and new models of automation/collaboration are necessary - and are perhaps the only way that all this changing momentum can be handled effectively. MT is simply one part of the equation and is far from being the whole solution. The need to solve this overall translation challenge is linked to the customer's business survival, so it gains a kind of momentum of inevitability. Businesses need to translate far more content to remain competitive in global marketplaces that move at internet speed, so automation and better collaboration are critical to success and even survival.
 
We have seen in the last five years that many of the largest global organizations have launched MT initiatives on their own, because their LSP vendors were (or are) stuck in the TEP mindset and did not realize that their customers had to learn to do dramatically more translation without much more money. This is perhaps a clue that in certain volume- and time-constrained scenarios, MT is necessary. We have seen that global enterprises need to solve this problem with or without the vendors who historically managed the bulk of their localization translation. My sense is that this trend is likely to build momentum if LSPs do not learn to offer real MT competence - the kind that comes from building custom systems and seeing what works and what does not. Global enterprises will increasingly take this task upon themselves if they cannot find LSPs who can help them solve this problem. TAUS, for example, is a mostly buyer-driven organization whose key focus is sharing TM and facilitating large-scale MT initiatives, and the greatest successes presented at TAUS are all in-house initiatives with little LSP involvement. Surely this is because there is a real need; we even see competitors willing to share linguistic data and resources to handle this problem. I suspect that the buyers' motivation is less about saving cents per word on translation costs, and much more about keeping and building customer loyalty and satisfaction in a world with growing global online commerce and information access needs.

My guess is that some of the anxiety about the coming change comes not so much from raw technology like MT; its real origin is perhaps the growing awareness that some of the work translators and LSPs are involved with grows less valuable to the customer's real mission: to build and develop international markets. Perhaps the anxiety is really rooted in the sense of not being involved in high-value work. The real threat is not MT per se, but the growing awareness amongst international marketing executives (in global enterprises) that they need to focus on what their customers really care about - and more and more often this is something other than getting a really great user manual out. Have you noticed that many leading-edge companies like Apple and Sony have dramatically reduced their investment in user manuals? The iPhone simply does not have one in the box (though one is available on the web). I am not suggesting that manuals are going away, but it is already clear that their relative value is diminishing. The content that drives global customer adoption and loyalty is changing, and thus the relative value of traditional localization (software and documentation) work changes too.

I expect that new translation production models to build success in international markets will involve MT (and other translation automation), crowdsourcing, and professional oversight and management. It is very likely that old production models like TEP will become less important, or just one of several approaches to translation projects, as new collaboration models gain momentum. I think the most successful approaches to solving these "new" translation problems will involve close and constructive collaboration between traditional localization professionals, linguists, MT developers, end customers and probably others in global enterprise organizations who have never worked in "localization" but are more directly concerned with the quality of the relationship with the final customer across the world. At the end of the day our value as an industry is determined by how useful our input is to the process of building international markets, and the requirements for success are changing as we speak.

The conversations at IMTT and the ensuing discussions suggest that while progress is being made in the understanding of translation technology, there is still a long way to go. I hope that at future IMTT conferences we see more discussion of approaches to translation projects where TEP may not make sense and automation and collaboration can help solve different kinds of problems that also further international business initiatives. I expect that IMTT will be a leader in changing the current polemic and in expanding the conversation to new stakeholders. This conversation is likely to require much more direct contact with product management, international sales and support teams, and the final end customer. Hopefully some of us in the industry get to lead or participate in driving this change through these new conversations.

While change can be difficult, it can also be a time of opportunity and a time when leadership changes. Too few try to understand the forces of change. People often go through a sequential emotional cycle before they learn to cope with, and eventually even thrive on, disruptive change. Those who get stuck at fear and despair often end up as victims.

This little video shows that effective and heartfelt communication across cultures need not be heavily planned, ponderous or calculated. Sometimes simple and real is enough to create the change and build a connection to your customers.

Where the Hell is Matt? (2008) from Matthew Harding on Vimeo.