
Standards, terminology and Europe

Luigi Muzii and Isabella Massardo's article on standards and terminology in the March 2016 issue of Multilingual.



We use standards every day, in all aspects of our lives. Some standards have been around for hundreds or even thousands of years. Think, for example, of weights and measures and how their differences and similarities affect us all. Standards provide a shared reference framework that ensures safety, reliability, interoperability and transparency, with partners having common expectations of each other's performance and products.

Translation and terminology standards

In 1991, during the third TermNet Summer School in Vienna, Christian Galinski predicted that, before the turn of the millennium, the importance of terminology would be universally acknowledged. Galinski also predicted that terminology would secure its own place among C-level executives. A quarter of a century later, terminology is still an ancillary discipline for a belittled profession, despite the large body of specialized literature devoted to it. Moreover, no standards effort, however smart, can keep pace with the evolution of technology.

Terminology plays a crucial role in accessing and managing information, especially today. Yet it remains a knowledge-intensive, labor-demanding human task, with users more and more often unaware of — and possibly uninterested in — its principles and methods, and the many terminological standards available become obsolete as soon as they are published because of the slowness of the standardization process and the verticality of its topics and efforts.

Every year, TermNet, the Vienna-headquartered International Network for Terminology, organizes an online training course with a final exam that requires the presentation of an application scenario. The course is sponsored by the European Certification and Qualification Association, a nonprofit association whose aim is to provide a worldwide unified certification schema for numerous professions.
Sessions are held by academics and experts tackling the main aspects of terminology management. Participants are given useful information and examples, but almost no practical exercises on term extraction, stop-word list building, term data handling or other real-life scenarios. Instead, much time is devoted to data categories, data modeling, semantic interoperability and even team management theory.

How much time can translators — be they freelance or in-house linguists — really spend on terminology, if we consider the productivity levels and strict deadlines imposed by the various parties involved in a translation project? From experience we know that translators hardly have the time to quickly click on the concordance option in a computer-aided translation (CAT) tool to browse through the translation memory they were given, and to add terms with a second click to a given term base. We also know that the exchange of term bases from one CAT tool to another will result in a loss of metadata and cause import problems.

In 2007 Erin McKean, a lexicographer and editor for the Oxford American English Dictionary, gave an enthusiastic TED Talk on the joys of lexicography. Her objective was clear even to a layman: the creation of an online dictionary collecting not only all the traditionally accepted words and definitions, but also new words and new uses for old words. The talk became a huge success. McKean heads Wordnik.com, the world's biggest online English dictionary by number of words. Example sentences are pulled from major news media (such as the Wall Street Journal), from books in the public domain (Project Gutenberg and the Internet Archive) and from other sources across the web, including less conventional ones such as blogs. The website also offers all sorts of information on each word: synonyms, hypernyms, hyponyms, words used in the same context, a reverse dictionary and tags.

Of course, there are differences between lexicography and terminology. One might suffice for all: while the former is descriptive, the latter tends to be more normalizing, if not prescriptive. But Wordnik.com is pointing us in the right direction.

Luigi Muzii is an independent consultant with over 30 years of experience as a translator, localizer, technical writer, teacher and trainer. Isabella Massardo is an ECQA-certified terminology manager with 25 years of experience in the translation industry and is content manager at Memsource.
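The kind of hands-on exercise the TermNet course reportedly skips — naive term extraction against a stop-word list — can be sketched in a few lines of Python. This is a toy illustration only; the stop-word list, threshold and sample text are invented, and real term extraction would be language-aware and handle multiword terms.

```python
from collections import Counter
import re

# Toy English stop-word list -- a real one would be language-specific and far longer.
STOP_WORDS = {"the", "a", "an", "of", "to", "and", "in", "for", "is", "are", "after"}

def extract_candidate_terms(text, min_freq=2):
    """Return frequent non-stop-word tokens as single-word term candidates."""
    tokens = re.findall(r"[a-z]+", text.lower())
    counts = Counter(t for t in tokens if t not in STOP_WORDS)
    return [term for term, freq in counts.most_common() if freq >= min_freq]

sample = ("Subordinated bonds are junior debt. Holders of subordinated "
          "bonds are repaid after holders of senior debt.")
print(extract_candidate_terms(sample))
# → ['subordinated', 'bonds', 'debt', 'holders']
```

Even this crude frequency count surfaces plausible domain terms; the point is that such an exercise takes minutes to set up in class.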
Collaborative, cloud-based translation environments that allow the sharing of linguistic data — in the form of translation memories and term bases — coming from all the parties involved in a translation project are the best way forward.

A role for Europe

The Old Continent is where standardization was born, and it is still the homeland of translation studies for many research, staffing and resource organizations. And yet, most efforts have focused on updating terminology and translation standards and issuing new ones, without giving evidence of their actual impact, if any, on the evolution of society.

Like translation, terminology is a complex, time-demanding, knowledge-intensive task, and it can be hard to show its cost effectiveness and attractiveness. Potential users could benefit from the definition and actual spreading of basic criteria and requirements for using terminology, and profit from it.

While we are writing, a controversy is raging over the insolvency of four Italian regional banks. Unscrupulous staff from these banks allegedly pushed many unknowing customers to buy subordinated bonds. Customers had to sign long contracts written in the typically abstruse language of finance without being given any explanation of the nature of the bonds they were buying, and they eventually lost their life savings.

It would be difficult for anyone to find a comprehensive and yet concise description of what subordinated bonds are, for example. Wikipedia only offers an entry, in English, for "subordinated debt," with the Italian equivalent, debito non garantito (junior debt), containing a reference to an obscure credito chirografario (unsecured debt).

Forget InterActive Terminology for Europe (IATE), the interinstitutional terminology database of the European Union (EU) administered by the Terminology Coordination Unit (TermCoord) of the Directorate-General for Translation (DG TRAD) of the European Parliament.
It has three entries for obbligazione subordinata, all marked as reliable, but with definitions that mostly overlap and are inconsistent with standard methodology. This should be solid evidence of the importance of terminology and of terminological resources, not only for translation but for everyday life. In fact, how many nonlinguists — and maybe even linguists — know of the existence of IATE?

And yet, this is not an isolated case. Fifteen years ago, at Linate Airport in Milan, Italy, an SAS airliner carrying 110 people collided on take-off with a business jet carrying four people. All 114 people on both aircraft were killed, as well as four ground personnel. Investigations identified a number of deficiencies in airport procedures, including violations of International Civil Aviation Organization regulations on the part of air traffic controllers, ranging from incorrect readbacks to the use of non-standard phraseology in communications, with one irrelevant term — extension — leading to a fatal misunderstanding.

All this calls into question the weight and trustworthiness of terminology standards. We also need to mention that neither the International Organization for Standardization (ISO) nor the other standard-setting bodies provide any public term base whatsoever. As for private term bases, a Common Sense Advisory survey revealed that only 41% of localization-mature organizations have some terminology management policy in place, and those are almost solely translation-oriented.

Ten years ago, in an article in volume 13, issue 3 of KMWorld titled "The high cost of not finding information," Susan Feldman reported that in 2001 the International Data Corporation (IDC) began to gather data on the costs an organization faces when it doesn't find the information it needs. IDC's
study showed that knowledge workers spent 15% to 35% of their time searching for information, that searches were successfully completed 50% of the time or less, and that only 21% of workers found the information they needed 85% to 100% of the time. The time spent looking for information and not finding it cost an organization a total of $6 million a year, not including opportunity costs or the cost of reworking the existing information that could not be located. Reworking the information that was not found cost that organization a further $12 million a year (15% of time spent duplicating existing information). The opportunity cost of not locating and retrieving information amounted to more than $15 million per year.

Also, in a study for the EU-funded MULTIDOC project in 2010, Jörg Schütz and Rita Nübel claimed that terminology has a cost multiplier of ten for localization and of 20 for maintenance. Terminology management can be extremely costly in the short term, especially for a localization-negligent organization. According to a JD Edwards study presented at the TAMA conference in Antwerp in February 2001, one terminological entry has a cost of $150.00.

Many potential terminology users are possibly not very interested in standards, but have an interest in the associated terminology. Of the hundreds of standards available from ISO and regional standards bodies, more than half contain terminology. This could be harmonized, structured and made publicly and freely available.

In November, the European Association for Terminology will celebrate its 20th anniversary in the historic first hemicycle of the European Parliament with a flashback on terminology activity during the past 20 years. At the event, a prize will be awarded for the best thesis on terminology.
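The shape of the IDC-style estimate is easy to reproduce as a back-of-envelope calculation. All parameters below are hypothetical, chosen only to illustrate how search-failure costs scale; they are not the figures from the study.

```python
# Back-of-envelope version of the cost-of-not-finding-information estimate.
# Every parameter here is an assumption for illustration, not IDC data.
workers = 1000                     # knowledge workers in the organization (assumed)
annual_cost_per_worker = 80_000    # fully loaded cost in dollars per worker (assumed)
search_time_fraction = 0.25        # midpoint of the 15%-35% search-time range
failure_rate = 0.50                # searches that fail, per the quoted study

# Dollar value of time spent on searches that find nothing
wasted_search_cost = (workers * annual_cost_per_worker
                      * search_time_fraction * failure_rate)
print(f"${wasted_search_cost:,.0f} per year lost to failed searches")
# → $10,000,000 per year lost to failed searches
```

Even with modest assumptions the figure lands in the millions, which is why the study's totals — before opportunity and rework costs — are plausible.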
Rather than financing mammoth Directorate-General for Translation (DGT)-oriented educational programs with the typical EU regulatory aim (have you ever heard of the bendy banana law?), the DGT could fund a program for the consolidation of the many dust-collecting terminological archives scattered all over the Old Continent in its innumerable universities. This program could be entrusted to a pool of outstanding graduates from the universities feeding the ranks of underpaid DGT interns.

Futurists, visionaries and wishful thinkers

In the last two decades, the ability to effectively use and integrate the wide range of software tools forming the typical translator's toolbox has become pivotal. Today, translating is less a question of language knowledge and more one of knowing how to use the right tools to exploit it. The integration of machine translation (MT) into the now widespread, comprehensive and increasingly mundane translation tools is making MT and post-editing part of a translator's daily job.

The last year marked the final confirmation of data as the lifeline of our online existence. With hardware increasingly commoditized and software simply a click away, data is gold. Machine learning technologies are revolutionizing everything, from image recognition to voice transcription to MT. These technologies require massive amounts of training data. Translators will have to be able to build parallel corpora; produce, access and use (big) data; process unstructured datasets for mining; and produce and manage rich terminology data.

Terabytes of translation data are produced in Europe alone every year. But, as Andrew Joscelyne and Anna Samiotou recently explained in the "TAUS Translation Data Landscape Report," data sources are heterogeneous and unbalanced, and private owners can be reluctant to give away their translation data for free or even to open-source it. Traditional public sources of translation data are no longer enough.
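Building a parallel corpus of the kind mentioned above can start from something as simple as pairing pre-aligned source and target segments into a tab-separated file. The sketch below uses invented example sentences and assumes the alignment has already been done upstream, which is usually the hard part.

```python
# Minimal sketch: turn pre-aligned segment lists into a TSV parallel corpus.
# The sentences are invented examples; real corpora need sentence alignment first.
import csv
import io

source = ["Subordinated bonds carry higher risk.",
          "The contract must be signed by the customer."]
target = ["Le obbligazioni subordinate comportano un rischio maggiore.",
          "Il contratto deve essere firmato dal cliente."]

buf = io.StringIO()
writer = csv.writer(buf, delimiter="\t", lineterminator="\n")
for src, tgt in zip(source, target):
    writer.writerow([src, tgt])   # one aligned segment pair per row

print(buf.getvalue())
```

A file in this shape is what most MT training pipelines expect as raw input, which is why even a small agency can contribute usable data.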
Incentives are necessary for a translation open data project in order to prevent any conflicts of interest. [M]