The etymology of Genoa

Why is Genoa called that? It depends on the period in which you ask the question.

Today the calendar says 2015, so let’s set aside the (interesting but well-known) medieval folk etymology from Ianua, and the far less interesting one that points to the Greek word xenos. Let’s talk about the “real” etymology of Genua, first attested on a milestone of 148 BC (CIL I¹ 540 = CIL V 8045).

The main hypothesis is that genua is an Indo-European word meaning “mouth” (*genaua), referring to the mouth of the river ‒ the Bisagno. The thesis was formalised by Xavier Delamarre, who notes in his Noms de lieux celtiques de l’Europe ancienne. Dictionnaire (p. 13, note 5; my translation):

To confine ourselves to toponymy, it is remarkable that the ancient name of Genoa, Genua, the Ligurian port par excellence, has a formation exactly parallel to that of the Gaulish Geneva, Genava, both going back to *Genoṷā, a derivation in -ā from a stem *genu- denoting the mouth in Celtic (Irish gin ‘mouth’, Welsh gên ‘jaw’), and hence by extension ‘the opening, the inlet’. Now, while the semantic shift mouth → inlet, harbour is trivial and universal (Latin ōs → ōstium, German Mund → Mündung, Finnish suu ‘mouth’ → (joen)suu, etc.), it is in Celtic and only in Celtic that the Indo-European stem *ǵénu- / *ǵonu-, which originally denoted the jaw or the cheeks (Latin genae, Gothic kinnus, Sanskrit hanu-, etc.), came by metonymy to designate the mouth. The “Ligurian” name of the port of Genua is therefore built on a stem whose semantics are specifically Celtic.

Among archaeologists, Delamarre found an early strong supporter in Filippo Maria Gambari. The validity of this hypothesis is independent of whether the pre-Roman Ligurian language is assigned to the Indo-European family or to a pre-Indo-European substrate, precisely because the name is attested only at such a late date, and could therefore be a linguistic borrowing from Celtic in a situation ‒ also attested archaeologically ‒ of Celto-Ligurian intermingling. The archaeological discoveries of the last decade in the area of the Bisagno river mouth strengthen this hypothesis and considerably weaken the earlier one, also of Indo-European scope, which pointed to a possible root *genu- “knee”, referring to another geographical feature of Genoa: the curve of the harbour.

It is interesting that both hypotheses take for granted the coincidence between the etymology of Genoa and that of Geneva (first recorded as Genaua in the De Bello Gallico), as indicated for example on the Wikeriadur Brezhoneg (the Breton Wiktionary). Breton (with a few related languages) and Welsh are in fact the only languages that allow this word to be identified as Celtic, thus creating a possible etymological link. Indeed, in the Catholicon breton (1464) the earliest written attestation of this term is given as guenou (while in contemporary Breton the lemma is genoù), so the older form is actually further from the “Celtic” form than the modern one. In Welsh, genau means “mouth, lips; estuary, entrance to a valley, pass, mouth (of sack, cave, bottle, &c.), hole; fig. saying, speech.” (Geiriadur Prifysgol Cymru). Both Welsh and Breton are considered “Insular” Celtic languages, that is, partly distinct from the Celtic languages once spoken on the continent (all the more so in Liguria) and now extinct.

The archaeologist Piera Melli, accepting this hypothesis, argues in her recent book Genova dalle origini all’anno Mille that an “Etruscanisation” of the name may also have taken place, essentially modelled on the name of kainua (Marzabotto) and on other Etruscan city names such as mantua and padua. However, this Etruscan form of the name of Genoa is not attested, and it remains a conjecture tied to the relative abundance of inscriptions in the Etruscan alphabet found in Genoa.

Genoa was born at the mouth of the Bisagno, but that was only the beginning.

Pottery and archaeology on the Web

Today marks five years since Tiziano Mannoni passed away.

There’s one thing that always characterised his work in publications and lectures: a need to visualise anything, from research processes to production processes and complex human-environment systems, in a schematic, understandable way. The most famous of such diagrams is perhaps the “material culture triangle”, in which artifacts, behaviors and meanings are the pillars on which archaeology is (or should be) based.

As a student, I was fascinated by those drawings, to the point of trying myself to create new ones. In 2012, in a rare moment of lucidity, I composed the diagram below trying to put together several loosely-related activities I had been doing in the previous years. Not much has changed since then, but it’s interesting to look back at some of the ideas and the tools.


Pottery 2.0

Kotyle is the name I gave to a prototype tool and data format for measuring the volume/capacity of ceramic vessels. The basic idea is to make volume/capacity measurements machine-readable and to allow automated measurement from digital representations of objects (such as SVG drawings). Some of the ideas outlined for Kotyle are now available in a usable form from the MicroPasts project, with the Amphora Profiling tool (I’m not claiming any credit for the MicroPasts tool, I just discussed some of the early ideas behind it). Kotyle is proudly based on Pappus’s theorems and sports Greek terminology whenever it can.
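Pappus’s second centroid theorem says that the volume of a solid of revolution equals the area of the generating profile times the distance travelled by its centroid: V = 2π·x̄·A. A minimal sketch of the idea in Python (this is not Kotyle’s actual code; the profile is a hypothetical polygon of (r, z) points revolved around the z axis):

```python
import math

def polygon_area_centroid(points):
    """Area and centroid of a simple polygon via the shoelace formula.

    points: list of (x, y) vertices in order (the polygon is closed implicitly).
    """
    a = cx = cy = 0.0
    n = len(points)
    for i in range(n):
        x0, y0 = points[i]
        x1, y1 = points[(i + 1) % n]
        cross = x0 * y1 - x1 * y0
        a += cross
        cx += (x0 + x1) * cross
        cy += (y0 + y1) * cross
    a *= 0.5
    cx /= 6.0 * a
    cy /= 6.0 * a
    return abs(a), cx, cy

def pappus_volume(profile):
    """Volume of the solid obtained by revolving `profile` around the z axis.

    profile: list of (r, z) vertices, all with r >= 0.
    """
    area, r_centroid, _ = polygon_area_centroid(profile)
    return 2 * math.pi * r_centroid * area

# A 1 x 1 square profile between r=1 and r=2 revolves into a ring
# of volume 2π · 1.5 · 1 = 3π.
ring = [(1, 0), (2, 0), (2, 1), (1, 1)]
print(round(pappus_volume(ring), 4))  # ≈ 9.4248
```

A real vessel profile would have many more vertices, but the computation is the same, which is what makes capacity measurement from drawings so cheap.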

SVG drawings of pottery are perhaps the only finalised item in the diagram. I presented this at CAA 2012 and the paper was published in the proceedings volume in 2014. In short: stop using [proprietary format] and use SVG for your drawings of pots, vases, amphoras, dishes, cups. If you use SVG, you can automatically extract geometric data from your drawings ‒ and maybe calculate the capacity of one thousand different amphoras in one second. Also, if you use SVG you can put links to other archaeological resources ‒ stratigraphic contexts, bibliographic references, photographs, production sites, etc. ‒ directly inside the drawing, by means of metadata and RDFa.
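As a toy example of what machine-readable drawings enable, here is a sketch that pulls the coordinate pairs out of an SVG polyline with Python’s standard library. The element structure and the profile coordinates are hypothetical, not a standard pottery SVG schema:

```python
import xml.etree.ElementTree as ET

SVG = """<svg xmlns="http://www.w3.org/2000/svg">
  <polyline id="profile" points="0,0 120,0 140,60 90,210 0,220"/>
</svg>"""

def profile_points(svg_text, elem_id="profile"):
    """Return the (x, y) pairs of the polyline with the given id."""
    ns = {"svg": "http://www.w3.org/2000/svg"}
    root = ET.fromstring(svg_text)
    for poly in root.findall(".//svg:polyline", ns):
        if poly.get("id") == elem_id:
            raw = poly.get("points").replace(",", " ").split()
            coords = [float(v) for v in raw]
            return list(zip(coords[::2], coords[1::2]))
    raise ValueError(f"no polyline with id {elem_id!r}")

print(profile_points(SVG))
```

From here the same coordinates could feed a Pappus-style capacity calculation, and RDFa attributes on the same elements could link each drawn profile to contexts, photographs or bibliography.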

Linked Open Archaeological Data (with the fancy LOAD acronym) is without doubt the most ambitious idea and ‒ unsurprisingly ‒ the least developed. Based on my own experience with studying and publishing archaeological data from excavation contexts, I came up with a simplified (see? I did this more than once) ontology, building on what I had seen in ArchVocab (by Leif Isaksen), that would enable the publication of ceramic typologies and type-series on the Web, linked to their respective bibliographic references and their production centres (Pleiades places, obviously), and then expand this to virtually any published find, context, dig, site. Everything would be linked, machine-readable and obviously open. Granularity is key here, and perhaps the only thing that is missing (or could be improved) in OpenContext. A narrow view of what it might look like for a single excavation project is GQBWiki. I don’t see anything similar to LOAD happening in the near future, however, so I hope that stating its virtual existence can help nurture further experiments in this direction.
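LOAD never got past the idea stage, but a minimal sketch of what one linked, machine-readable record could look like is easy to give, for example as JSON-LD built with the standard library. All the example.org URIs below are hypothetical placeholders and the Pleiades identifier is illustrative; the @context is deliberately simplistic:

```python
import json

# Hypothetical linked record for a ceramic type in a LOAD-like ontology.
record = {
    "@context": {
        "name": "http://www.w3.org/2000/01/rdf-schema#label",
        "sameAs": "http://www.w3.org/2002/07/owl#sameAs",
        "producedAt": "http://example.org/load/producedAt",  # hypothetical term
    },
    "@id": "http://example.org/load/type/hayes-61b",  # hypothetical URI
    "name": "African Red Slip, Hayes 61B",
    "producedAt": {
        "@id": "http://example.org/load/place/example-kiln-site",  # hypothetical
        "sameAs": "https://pleiades.stoa.org/places/000000",  # illustrative id
    },
}

print(json.dumps(record, indent=2))
```

The point is the granularity: every type, place and reference gets its own dereferenceable URI, so any other dataset can point at it.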

The original case study for LOAD is ARSILAI: African Red Slip in Late Antique Italy, that is, my master’s thesis. The web-based application I wrote in Django naturally became the inspiration for a published resource that could be constantly updated, based on feedback and contributions from the many scholars in the field of late Roman pottery. Each site, dig, context, sherd family, sherd type and ware has a clean URI, with sameAs links where available (e.g. sites can be Pleiades places, digs can be FastiOnLine records). Bibliographic references are just URIs of Zotero resources, since creating bibliographic databases from scratch is notoriously a bad idea. In 2012 I briefly had this online on an AWS free tier server, but since then I have never again had the time to deploy it somewhere (in the meantime, the release lifecycle of Django and other dependencies means I need to upgrade parts of my own source code to make it run smoothly again). One of the steps I took to make the web application less resource-hungry when running on a web server was to abandon Matplotlib (which I otherwise love and have used extensively) and create the plots of chronology distribution with a JavaScript library, based on JSON data: the server just creates a JSON payload from the database query instead of a static image rendered by Matplotlib functions. GeoJSON as an alternate format for sites was also a small but useful improvement (it can be loaded by mapping libraries such as Leaflet and OpenLayers). One of the main aims of ARSILAI was to show the geospatial distribution of African Red Slip ware, with the relative and absolute quantities of finds. Quantitative data is the actual focus of ARSILAI, with all the implications of using sub-optimal “data” from literature, sometimes 30 years old (but, honestly, most current publications of ceramic contexts are horrible at providing quantitative data).
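The Matplotlib-to-JSON switch boils down to serialising the query result instead of rendering it server-side. A stripped-down sketch of both payloads (plain functions rather than actual Django views; the field names and figures are made up for illustration):

```python
import json

def chronology_payload(rows):
    """Turn (year, sherd_count) query rows into a JSON payload
    that a JavaScript charting library can plot client-side."""
    return json.dumps({
        "labels": [year for year, _ in rows],
        "counts": [count for _, count in rows],
    })

def site_feature(name, lon, lat, sherds):
    """A GeoJSON Feature for one site, loadable by Leaflet or OpenLayers."""
    return {
        "type": "Feature",
        "geometry": {"type": "Point", "coordinates": [lon, lat]},
        "properties": {"name": name, "sherds": sherds},
    }

rows = [(350, 12), (400, 30), (450, 18)]  # hypothetical counts per year
print(chronology_payload(rows))
feature = site_feature("Example site", 10.1, 36.8, 60)
print(json.dumps({"type": "FeatureCollection", "features": [feature]}))
```

Shipping data rather than pixels keeps the server light and lets the client re-plot, filter or restyle without another request.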

So the last item in the “digital approaches to archaeological pottery” toolbox is statistics. Develop open source function libraries for R and Python that deal with commonly misunderstood methods like estimated vessel equivalents and their statistical counterpart, pottery information equivalents (pie-slices). Collect data from body sherds with one idea (assessing quantity based on the volume of pottery, calculated from weight and thickness sherd by sherd), only to find an unintended phenomenon that I think was previously unknown: sherd weight follows a log-normal or power-law distribution, at any scale of observation. Realise that there is not one way to do things well, but rather multiple approaches to quantification depending on your research question, including the classic trade networks but also depositional histories and household economics. At this point, it’s full circle. The diagram is back at square one.
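Of the methods mentioned, estimated vessel equivalents (EVE) are the easiest to sketch: each rim sherd preserves a measurable fraction of the original rim circumference, and summing those fractions estimates how many whole vessels the assemblage represents. A minimal illustration, not one of the actual library functions:

```python
def eve(rim_percentages):
    """Estimated vessel equivalents from rim arc measurements.

    rim_percentages: preserved fraction of each rim sherd, in percent
    (e.g. a sherd keeping a 90-degree arc of the rim counts as 25.0).
    """
    return sum(rim_percentages) / 100.0

# Four rim sherds from a hypothetical context: together they amount to
# 1.2 "vessel equivalents", however many actual vessels they came from.
print(eve([25.0, 50.0, 30.0, 15.0]))  # 1.2
```

The statistical subtlety (Orton’s pie-slices) is in treating each assemblage as a sample of such slices rather than a count of vessels; that is where a proper function library earns its keep.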

What’s the correlation between the exposure time of your photographs and the time of the day?

My digital photo archive spans 15 years and holds about 12,600 pictures (not so many, after all). I’m curious to see if there is a correlation between the exposure time of my photographs and the time of the day they were taken. A rather simplistic observation, perhaps.

In short: there’s nothing spectacular about this correlation, but it’s nice. The morning hours have the lowest average exposure time, around 1/320 s between 9 and 10 AM (the plot is reversed, so you can look at familiar integer numbers). There’s a sharp increase between 12 and 1 PM, and it increases again after 4 PM towards dusk. I don’t take many pictures at night!
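The aggregation behind the plot is a simple group-by: bucket each photograph by the hour of its timestamp and average the exposure times. A stdlib-only sketch with made-up records (the actual scripts are the Python and R ones in the expotime repository mentioned at the end of this post):

```python
from collections import defaultdict

# (hour of day, exposure time in seconds) — hypothetical EXIF data
records = [
    (9, 1/320), (9, 1/400), (10, 1/320),
    (13, 1/125), (13, 1/60), (18, 1/30),
]

def mean_exposure_by_hour(records):
    """Average exposure time (in seconds) for each hour of the day."""
    buckets = defaultdict(list)
    for hour, exposure in records:
        buckets[hour].append(exposure)
    return {hour: sum(v) / len(v) for hour, v in sorted(buckets.items())}

for hour, mean in mean_exposure_by_hour(records).items():
    # print in the familiar 1/n notation
    print(f"{hour:02d}:00  1/{round(1 / mean)} s")
```

With real data the hours would come from the EXIF DateTimeOriginal tag and the exposures from the ExposureTime tag.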

See for yourself.


The most frequent values for exposure time are in the table below. 1/30 s is the typical exposure time when using the flash, and it’s recognisable in the plot above.

1/n s   occurrences
800     1986
1000    1178
30      943
400     547
250     488
640     458
200     450
500     388
320     342
160     337

The Python and R scripts are at https://gitlab.com/steko/expotime (giving GitLab a spin since GitHub is a monopoly and I don’t like that). I’m still doing some experiments with the source data, then I’ll upload those as well.

The life of Andrew of Crete

Andrew of Crete was a (famous) archbishop of Crete during the early 8th century. He is venerated as a saint by both the Orthodox and the Catholic church, and even today he is particularly appreciated as a hymnographer.

Andrew was born in Damascus, spent his early years in Jerusalem (which is why he is also known as Andrew of Jerusalem), and he spent only a fraction of his life in Crete; but from his biography we learn that he was particularly active in the “typical” activities of an archbishop, such as building new churches, taking care of existing ones, etc. Given the scarcity of written sources about Crete in the 8th century, the life of Andrew of Crete is an important historical document for the study of this period. However, it should be noted that no more than half of the text is about the years Andrew served as ἀρχιεπίσκοπος τῆς Κρητῶν φιλοχρίστου νήσου, archbishop of the Christ-loving island of Crete.

I heard several mentions of this biography in Rethymno last month, so I was rather curious. To my discontent, I could only find the same boilerplate summary of his life on the many thematic websites dedicated to Christian saints, and even Wikipedia doesn’t stand out as particularly accurate. What is really disappointing is that nowhere was there a reference to the actual biography! It was published in Saint Petersburg in 1898, in volume 5 of the Ανάλεκτα Ιεροσολυμιτικής σταχυολογίας, under the descriptive title Βίος του εν αγίοις πατρὸς ημων Ανδρέου του Ιεροσολυμίτου, ὰρχιεπισκὸπου γενομένου Κρήτης. The reference is found in some scholarly publications, such as G. Kourtzian’s «L’incident de Knossos (fin Septembre/début Octobre 610)», Travaux et mémoires, vol. 17, 2013, p. 182 (even though the reference there has the wrong publication year). The links above point to the Internet Archive: the life of Andrew of Crete is actually available on the Web, but it is difficult to find. Unfortunately the Google scan on the Internet Archive is missing two pages (170-171).

There are two manuscripts of the biography of Andrew, one from the monastery of Vatopedi, the other from Agios Dionisios, both on Mount Athos.

I have started a transcription at Wikisource. I had never typed polytonic Greek before, but it’s very easy to get a decent typing speed on GNU/Linux following the document Writing Greek, Greek Polytonic (Ancient Greek) on Linux by Simos Xenitellis.

Interlude pages for my PhD thesis (with sketch drawings!)

Who said PhD theses have to be boring texts with horrible typography?

Even if my thesis is far from ready for discussion, I can’t help indulging in some diversions from the actual writing. Today I put together this experiment for an interlude page: imagine you’re skimming through dozens of pages and suddenly your eyes catch something different: a short sentence at font size 36, coupled with a rough sketch drawing of a Byzantine cooking pot, or the interior of a cellar where a young girl is walking to bring wine to the table.

Challenging myself to hand-drawing

The drawings are mine ‒ pencil on pieces of recycled paper, with minor passes of digital editing and vectorisation. You may like them, but they’re not sketchy as an artistic choice: that’s just the best I am able to do with my bare hands. Some practice might help, I am told.

The text is typeset in the Brill font, which is free only for personal use, but I like it and I wanted to experiment. Alegreya, Linux Libertine and Source Serif all look good on that page, too. I think it needs a serif font.

Does this bring more value to the surrounding pages? I’m not sure, to be honest. It could be said that such pages distract from the actual content, which is supposed to be of academic value, and that this kind of page layout is best left to architecture and design magazines. However, not everyone is going to read your PhD thesis from cover to cover, and a bit of typographic colour here and there will not hurt.


William Gibson, archaeologist

Earlier this year, on cold January morning commutes, I finally read William Gibson’s masterpiece trilogy. If you know me personally, this may sound ironic, because I dig geek culture quite a bit. Still, I’m a slow reader and I had never had a chance to read the three books before. Which was good, actually, because I could enjoy them deeply, without the kind of teenage infatuation that is quickly gone ‒ and most importantly because I could read the original books instead of a translation: I don’t think my 15-year-old self could have read English prose, not Gibson’s prose at least, that easily.

I couldn’t help several moments of excitement at the frequent glimpses of archaeology along the chapters. This could be a very naive observation, and maybe there are countless critical studies that I don’t know of dealing with the role of archaeology in the Sprawl trilogy and in Gibson’s work in general. Perhaps it’s touching for me because I deal with Late Antiquity, which is the closest thing to a dystopian future that ever happened in the ancient world, at least as we see it, with its abundance of useless objects and places from past centuries of grandeur. Living among the ruins of once beautiful buildings, living at the edge of society in abandoned places, reusing what was discarded in piles, black markets, spirituality: it’s all so late antique. Of course the plot of the Sprawl trilogy is a contemporary canon, and the characters are post-contemporary projections of a (very correctly) imagined future, but the setting is, to me, evocative of a world narrative that I could easily embrace if I had to write fiction about the periods I study.

Count Zero is filled with archaeology, especially of course the Marly chapters. Towards the end it gets more explicit, but it’s there in almost all chapters, and it has something to do with the abundance of adjectives, the care for details in little objects. Mona Lisa Overdrive is totally transparent about it, from the first pages of Angie Mitchell on the beach:

The house crouched, like its neighbors, on fragments of ruined foundations, and her walks along the beach sometimes involved attempts at archaeological fantasy. She tried to imagine a past for the place, other houses, other voices.

– William Gibson. Mona Lisa Overdrive, p. 35.

But really, you just have to follow Molly along the maze of the Straylight Villa in Neuromancer to realise it’s a powerful theme of the whole Sprawl trilogy.

The Japanese concept of gomi, which pervades Kumiko’s view of Britain and Rubin’s art in “The Winter Market”, is another powerful tool for material culture studies, at least if we have to find a pop dimension where our studies survive beyond the inevitable end of academia.

Being a journal editor is hard

I’ve been serving as co-editor of the Journal of Open Archaeology Data (JOAD) for more than a year now, since I joined Victoria Yorke-Edwards in the role. It is my first time in an editorial role at a journal. I am learning a lot, and the first thing I learned is that being a journal editor is hard and takes time, effort and self-esteem. I’ve been thinking about writing down a few thoughts for months now, and today’s post by Melissa Terras about “un-scholarly peer review practices […] and predatory open access publishing mechanisms” was an unavoidable inspiration (go and read her post).

Some things are peculiar to JOAD, such as the need to ensure data quality at a technical level: often, though, improvements on the technical side reflect substantially on the general quality of the data paper. Some things may seem easily understood, like using CSV for tabular data instead of PDF, or describing the physical units of each column/variable. Often, archaeology datasets related to PhD research are not forged in highly standardised database systems, so there may be small inconsistencies in how the same record is referenced across tables. In my experience so far, reviewers look at data quality even more than at the paper itself, which is a good sign: they are assessing the “fitness for reuse” of a dataset.
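The CSV-with-explicit-units advice is cheap to follow: one common convention is simply to put the unit in each column header. A hypothetical example with Python’s csv module (the column names and figures are invented for illustration):

```python
import csv
import io

# Hypothetical quantification table: note the units in the headers.
rows = [
    {"context": "US 1001", "sherd_count": 42, "weight_g": 815.2, "rim_diameter_mm": 180},
    {"context": "US 1002", "sherd_count": 7, "weight_g": 96.0, "rim_diameter_mm": 140},
]

fieldnames = ["context", "sherd_count", "weight_g", "rim_diameter_mm"]
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=fieldnames)
writer.writeheader()
writer.writerows(rows)
print(buf.getvalue())
```

Unlike a table locked inside a PDF, this file can be loaded into R, Python or a spreadsheet in one step, and the units travel with the data.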

The data paper: you have to try authoring one before you get a good understanding of how a good data paper is written and structured. Authors seem to prefer terse and minimal descriptions of the methods used to create their dataset, taking many steps for granted. The JOAD data paper template is a good guide to structuring a data paper and to the minimum metadata that is required, but we have seen authors relying almost exclusively on the default sub-headings. I often point reviewers and authors to some published JOAD papers that I find particularly good, but the advice isn’t always heeded. It’s true, the data paper is a rather new and still unstable concept of the digital publishing era: Internet Archaeology has been publishing some beautiful data papers, and I like to think there is mutual inspiration in this regard. Data papers should be a temporary step towards open archaeology data as the default, and continuous open peer review as the norm for improving the global quality of our knowledge, wiki-like. However, data papers without open data are pointless: choose a good license for your data and stick with it.

Peer review is the most crucial and exhausting activity: as editors, we have to give a first evaluation of the paper based on the journal scope and then proceed to find at least two reviewers. This requires a broad knowledge of ongoing research in archaeology and related disciplines, including very specific sub-fields of study ‒ our list of available reviewers is quite long now, but there’s always some unknown territory to explore, so asking other colleagues for help and suggestions is vital. Still, there is a sense of inadequacy, a variation on the theme of impostor syndrome, when you have a hard time finding a good reviewer: someone who will provide the authors with positive and constructive criticism, becoming truly part of the editorial process. I am sorry that our current publication system doesn’t allow for the inclusion of both the reviewers’ names and their commentary ‒ that’s the best way to provide readers with an immediate overview of the potential of what they are about to read, and a very effective rewarding system for reviewers themselves (I keep a list of all the peer reviews I do, but that doesn’t seem as satisfying). Peer review at JOAD is not double blind, and I think it would often be ineffective and useless to anonymise a dataset and a paper in a discipline so territorial that everyone knows who is working where. It is incredibly difficult to get reviews in a timely manner, and while some of our reviewers are perfect machines, others keep us (editors and authors) waiting for weeks after the agreed deadline has passed. I understand this, of course, being too often on the other side of the fence. I’m always a little hesitant to send e-mail reminders in such cases, partly because I don’t like receiving them, but being an annoyance is kind of necessary here.
The reviews are generally remarkable in their quality (at least compared to previous editorial experiences I have had), quite long and honest: if something isn’t quite right, it has to be pointed out very clearly. As an editor, I have to read the paper, look at the dataset, find reviewers, wait for reviews, solicit reviews, read reviews and sometimes have a conversation with reviewers, to check that their comments are clear and their phrasing/language acceptable (an adversarial, harsh review must never be accepted, even when formally correct). All this is very time consuming, and since journal (co)editor is an unpaid role at JOAD and other overlay journals at Ubiquity Press (perhaps obvious, perhaps not!), this usually means procrastinating: summing the impostor-syndrome dose from criticising a review provided by a more experienced colleague with the impostor-syndrome dose from being always late on editorial deadlines yields frustration. Lots. Of. Frustration. When you see me tweet about a new data paper published at JOAD, it’s not an act of deluded self-promotion, but rather a liberating moment of achievement. All this may sound naive to experienced practitioners of peer review, especially those in academic careers. I know, and I would still like to see a more transparent discussion of how peer review should work (not on StackExchange, preferably).

JOAD is Open Access. It’s the true Open Access: the distinction that matters is not between gold and green (a dead debate, it seems) but between two radically different outputs. JOAD is openly licensed under the Creative Commons Attribution license, and we require that all datasets be released under open licenses, so readers know they can download, reuse and incorporate published data in their new research. There is no “freely available only in PDF”: each article is primarily presented as native HTML and can be obtained in other formats (including PDF and EPUB). We could do better, sure ‒ for example, provide the ability to interact directly with the dataset instead of just providing a link to the repository ‒ but I think we will be giving more freedom to authors in the future. Publication costs are covered by an Article Processing Charge of £100, paid by the authors’ institutions: in case this is not possible, the fee is waived. Ubiquity Press is involved in some of the most important current Open Access initiatives, such as the Open Library of Humanities, and most importantly does a wide range of good things to ensure research integrity from article submission to … many years in the future.

You may have received an e-mail from me with an invitation to contribute to JOAD, either by submitting an article or by giving your availability as a reviewer ‒ or you may receive one in the next few weeks. Either way, now you have had a chance to learn what goes on behind the scenes at JOAD.