Posts Tagged ‘google’

Photo: The Metropolitan Museum of Art.
Google’s Aeneas AI program proposes words to fill the gaps in worn and damaged artifacts. 

Whenever I start to worry that Google has too much power, it does something useful. Today’s story is about its artificial intelligence program Aeneas, which can make a guess about half-obliterated letters in ancient inscriptions.

Ian Sample, science editor at the Guardian, writes, “In addition to sanitation, medicine, education, wine, public order, irrigation, roads, a freshwater system and public health, the Romans also produced a lot of inscriptions.

“Making sense of the ancient texts can be a slog for scholars, but a new artificial intelligence tool from Google DeepMind aims to ease the process. Named Aeneas after the mythical Trojan hero, the program predicts where and when inscriptions were made and makes suggestions where words are missing.

“Historians who put the program through its paces said it transformed their work by helping them identify similar inscriptions to those they were studying, a crucial step for setting the texts in context, and proposing words to fill the inevitable gaps in worn and damaged artifacts.

” ‘Aeneas helps historians interpret, attribute and restore fragmentary Latin texts,’ said Dr Thea Sommerschield, a historian at the University of Nottingham who developed Aeneas with the tech firm. …

“Inscriptions are among the most important records of life in the ancient world. The most elaborate can cover monument walls, but many more take the form of decrees from emperors, political graffiti, love poems, business records, epitaphs on tombs and writings on everyday life. Scholars estimate that about 1,500 new inscriptions are found every year. …

“But there is a problem. The texts are often broken into pieces or so ravaged by time that parts are illegible. And many inscribed objects have been scattered over the years, making their origins uncertain.

“The Google team led by Yannis Assael worked with historians to create an AI tool that would aid the research process. The program is trained on an enormous database of nearly 200,000 known inscriptions, amounting to 16m characters.

“Aeneas takes text, and in some cases images, from the inscription being studied and draws on its training to build a list of related inscriptions from 7th century BC to 8th century AD. Rather than merely searching for similar words, the AI identifies and links inscriptions through deeper historical connections. …
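
(The article doesn't spell out how Aeneas finds those "deeper historical connections," but a common way to do this kind of retrieval is to encode each inscription as a numeric embedding and rank the corpus by vector similarity rather than by shared words. The toy Python sketch below only illustrates that general pattern; the inscription names and random vectors are invented stand-ins for a real model's output, not anything taken from the paper.)

    # Illustrative only: rank known inscriptions by cosine similarity of embeddings.
    # The vectors here are random; in a real system they would come from a trained model.
    import numpy as np

    rng = np.random.default_rng(0)

    corpus_ids = ["altar_mogontiacum", "decree_lugdunum", "epitaph_carthago"]  # invented labels
    corpus_embeddings = rng.normal(size=(3, 8))   # 3 known inscriptions, 8-dimensional vectors
    query_embedding = rng.normal(size=8)          # the inscription being studied

    def top_related(query, corpus, ids, k=2):
        """Return the k corpus inscriptions closest to the query by cosine similarity."""
        corpus_norm = corpus / np.linalg.norm(corpus, axis=1, keepdims=True)
        query_norm = query / np.linalg.norm(query)
        scores = corpus_norm @ query_norm
        order = np.argsort(scores)[::-1][:k]
        return [(ids[i], float(scores[i])) for i in order]

    print(top_related(query_embedding, corpus_embeddings, corpus_ids))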

“The AI can assign study texts to one of 62 Roman provinces and estimate when they were written to within 13 years. It also provides potential words to fill in any gaps, though this has only been tested on known inscriptions where text is blocked out.
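
(Likewise, the excerpt doesn't describe how the gap-filling works internally. As a rough illustration of the task itself, the sketch below takes an inscription with a marked gap and ranks candidate words of the right length by how often they appear in a tiny invented reference corpus; Aeneas is a neural model trained on the full database of roughly 200,000 inscriptions, so this is only a caricature of the problem.)

    # Illustrative only: propose words for a gap of known length in a damaged text.
    import re
    from collections import Counter

    # A tiny invented reference corpus of formulaic Latin phrases.
    reference_corpus = [
        "dis manibus sacrum",
        "iovi optimo maximo",
        "dis manibus et memoriae",
    ]

    def propose_fill(damaged, corpus):
        """Rank candidate words whose length matches the [...] gap by corpus frequency."""
        gap_length = len(re.search(r"\[(\.+)\]", damaged).group(1))
        counts = Counter(word for text in corpus for word in text.split())
        candidates = [(w, c) for w, c in counts.items() if len(w) == gap_length]
        return sorted(candidates, key=lambda wc: -wc[1])

    # A damaged epitaph with a seven-letter gap; the sketch suggests "manibus".
    print(propose_fill("dis [.......] sacrum", reference_corpus))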

“In a test … Aeneas analyzed inscriptions on a votive altar from Mogontiacum, now Mainz in Germany, and revealed through subtle linguistic similarities how it had been influenced by an older votive altar in the region. ‘Those were jaw-dropping moments for us,’ said Sommerschield. Details are published in Nature. …

“In a collaboration, 23 historians used Aeneas to analyze Latin inscriptions. The context provided by the tool was helpful in 90% of cases. ‘It promises to be transformative,’ said Mary Beard, a professor of classics at the University of Cambridge.

“Jonathan Prag, a co-author and professor of ancient history at the University of Oxford, said Aeneas could be run on the existing corpus of inscriptions to see if the interpretations could be improved. He added that Aeneas would enable a wider range of people to work on the texts.

“ ‘The only way you can do it without a tool like this is by building up an enormous personal knowledge or having access to an enormous library,’ he said. ‘But you do need to be able to use it critically.’ “

More at the Guardian, here. Please remember that this free news outlet needs donations.


As we have noted in other posts on the subject, one of the most ephemeral forms of art is street art. Many street artists like it that way, but others hate to see the work disappear.

Deborah Vankin and Saba Hamedy write at the Los Angeles Times that Google has decided to do something about that.

“A new worldwide database of public art aims to preserve — if only in digital form — street art, a medium that is often political, sometimes renegade and, perhaps most important, frequently fleeting. These are artworks that may get tagged by graffiti or fall into decay because of weather exposure. The accessible, populist nature of the medium — buildings and sidewalks as canvases — also is what makes them vulnerable. …

” ‘You never know when a mural will be scrubbed out or painted over,’ said Lucy Schwartz, program manager for the Google Cultural Institute, the umbrella organization that this week launched an expanded version of its searchable database of photos simply called Street Art. ‘Our goal is to offer a permanent home for these works so users today and tomorrow can enjoy them and learn about them.’ …

“The project launched in June 2014 with 5,000 images and 31 partnering organizations internationally. [In March] Google added 55 partners who have helped to document more than 5,000 more pieces of public art, all viewable at streetart.withgoogle.com/en/. The collection includes Australia, Sweden, Colombia, South Africa — 34 countries in all. It also includes mobile apps and listening tours, as well as a map on which visitors can click to browse local art. …

“Google has said the street art in its online project cannot be downloaded, and the company credits all featured artists. Images in Street Art also include the title of the mural and the date it was created.” More here.

As you might imagine, some of the artists working in this form were highly skeptical of Google’s outreach to them. A Buddhist-like acceptance of the transitory nature of all things certainly seems incompatible with a Google database. But for many artists, digital preservation is welcome.

Photo: Noel Celis / AFP/Getty Images
A mural by an unidentified artist in Manila. Jan. 26, 2015.
