UCLDH Blog

Archive for the 'Research Papers' Category

Vision for Art (VISART) Workshop for interdisciplinary work in Computer Vision and Digital Humanities

Lucy Stagg, 26 April 2022

The VISion for Art (VISART) workshop is an interdisciplinary workshop held biennially with the European Conference on Computer Vision (ECCV). Now in its sixth edition, the workshop has run successfully for ten years since starting in Florence (2012), with the 2022 edition held in Tel Aviv, Israel. This success has made VISART a staple venue for Computer Vision and Digital Art History & Humanities researchers alike. In keeping with its ambition to bring the disciplines closer together and provide a venue for interdisciplinary communication, the workshop has since 2018 offered two tracks, covering both the technological development of computer vision techniques applied to the arts and critical reflection on their use. The two tracks are:

1. Computer Vision for Art – technical work (standard ECCV submission, 14 pages excluding references)
2. Uses and Reflection of Computer Vision for Art (Extended abstract, 4 pages, excluding references)

Full details are available at the workshop website: https://visarts.eu

Keynotes

In addition to the technical works presented, the workshop regularly attracts keynote speakers who bridge the disciplines, including (but not limited to):

Keynote speakers from across the years of VISART, images from 2022 public institutional profiles and current affiliation logos.

The inclusion of such a varied collection of keynote speakers has prompted fruitful discussion on the use of technology to investigate visual content, from its style and perception (Aaron Hertzmann, Adobe) to how the “hard humanities” field of computer image analysis of art is changing our understanding of paintings and drawings (David G. Stork).

VISART VI Keynotes

The VISART VI (2022) workshop continues this tradition of high-profile keynotes, adding Prof Béatrice Joyeux-Prunel of the University of Geneva and Prof Ohad Ben-Shahar of Ben Gurion University to the list, and welcoming back Prof John Collomosse of the University of Surrey.

Prof Béatrice Joyeux-Prunel

Béatrice Joyeux-Prunel is Full Professor at the University of Geneva, Switzerland (Faculté de Lettres – School of Humanities), where she holds the chair of Digital Humanities. From 2007 to 2019 she was Associate Professor (maître de conférences) in modern and contemporary art at the École normale supérieure in Paris, France (ENS, PSL). She is a former student of the ENS (alumni 1996, Social Sciences and Humanities) and obtained the Agrégation in History and Geography in 1999. She defended her PhD in 2005 at the Université Paris 1 Panthéon-Sorbonne and her Habilitation at Sciences Po Paris in 2015. Joyeux-Prunel’s research encompasses the history of visual globalisation, the global history of the avant-gardes, digital technologies in contemporary art, and the digital turn in the Humanities. In 2009 she founded the Artl@s project on modern and contemporary art globalisation (https://artlas.huma-num.fr), which she has managed since, and she co-edits the open access journal Artl@s Bulletin. In 2016 she founded Postdigital (www.postdigital.ens.fr), a research project on digital cultures and imagination. Since 2019 she has led the European Jean Monnet Excellence Center IMAGO, an international centre for the study and teaching of visual globalisation. At the University of Geneva she directs the SNF project Visual Contagions (https://visualcontagions.unige.ch), a four-year research project on images in globalisation, which uses computer vision techniques to trace the global circulation of images in printed material over the twentieth century.

Prof John Collomosse

John Collomosse is a Principal Scientist at Adobe Research, where he leads the deep learning group. John’s research focuses on representation learning for creative visual search (e.g. sketch-, style-, and pose-based search) and for robust image fingerprinting and attribution. He is a part-time Full Professor at the Centre for Vision, Speech and Signal Processing, University of Surrey (UK), where he founded and co-directs DECaDE, a multi-disciplinary research centre exploring the intersection of AI and Distributed Ledger Technology. John is part of the Adobe-led Content Authenticity Initiative (CAI) and a contributor to the technical working group of the C2PA open standard for digital provenance. He sits on the ICT and Digital Economy advisory boards of the UK research council EPSRC.

Prof. Ohad Ben-Shahar

Ohad Ben-Shahar is a Professor of Computer Science at the Computer Science department, Ben Gurion University (BGU), Israel. He received his B.Sc. and M.Sc. in Computer Science from the Technion (Israel Institute of Technology) in 1989 and 1996, respectively, and his M.Phil. and PhD from Yale University, CT, USA in 1999 and 2003, respectively. He is a former chair of the Computer Science department and the present head of the School of Brain Sciences and Cognition at BGU. Prof Ben-Shahar’s research focuses on computational vision, with interests that span all aspects of theoretical, experimental, and applied vision sciences and their relationship to cognitive science as a whole. He is the founding director of the interdisciplinary Computational Vision Laboratory (iCVL), where research involves theoretical computational vision, human perception and visual psychophysics, visual computational neuroscience, animal vision, applied computer vision, and (often biologically inspired) robot vision. He is a principal investigator in numerous research activities, from basic research animal vision projects through applied computer vision, data sciences, and robotics consortia, many of them funded by agencies such as the ISF, NSF, DFG, the National Institute for Psychobiology, the Israel Innovation Authority, and European frameworks such as FP7 and Horizon 2020.

Call for Papers

The workshop calls for papers on topics including (but not limited to):

  • Art History and Computer Vision
  • 3D reconstruction from visual art or historical sites
  • Multi-modal multimedia systems and human machine interaction
  • Visual Question Answering (VQA) or Captioning for Art
  • Computer Vision and cultural heritage
  • Big-data analysis of art
  • Security and legal issues in the digital presentation and distribution of cultural information
  • Image and visual representation in art
  • 2D and 3D human pose and gesture estimation in art
  • Multimedia databases and digital libraries for artistic research
  • Interactive 3D media and immersive AR/VR for cultural heritage
  • Approaches for generative art
  • Media content analysis and search
  • Surveillance and Behaviour analysis in Galleries, Libraries, Archives and Museums

Deadlines & Submissions

  • Full & Extended Abstract Paper Submission: 27th May 2022 (23:59 UTC-0)
  • Notification of Acceptance: 30th June 2022
  • Camera-Ready Paper Due: 12th July 2022
  • Workshop: TBA (23-27th October 2022)
  • Submission site: https://cmt3.research.microsoft.com/VISART2022/

Organisers

The VISART VI 2022 edition of the workshop has been organised by:

  • Alessio Del Bue, Istituto Italiano di Tecnologia (IIT)
  • Peter Bell, Philipps-Universität Marburg
  • Leonardo Impett, University of Cambridge
  • Noa Garcia, Osaka University
  • Stuart James, Istituto Italiano di Tecnologia (IIT) & University College London Centre for Digital Humanities (UCL DH)

VISART VI 2022 organisers

UCLDH co-authored article nominated for Digital Humanities award

Lucy Stagg, 14 March 2022

An article co-authored by UCLDH team member Prof Julianne Nyhan and Dr Alexandra Ortolja-Baird has been nominated for a Digital Humanities Award.

As explained on the Digital Humanities Awards website:

Digital Humanities Awards are a set of annual awards where the public is able to nominate resources for the recognition of talent and expertise in the digital humanities community. The resources are nominated and voted for entirely by the public. The weeding out by the nominations committee is solely based on the criteria of “Is it DH?”,  “Can voters see it?”, “Is it in the right category?”, and “Was it launched/published/majorly updated in that year?”. These awards are intended as an awareness raising activity, to help put interesting DH resources in the spotlight and engage DH users (and general public) in the work of the community. Awards are not specific to geography, language, conference, organization or field of humanities that they benefit. Any suitable resource in any language or writing system may be nominated in any category. DH Awards actively encourages representation from more minority languages, cultures, and areas of DH. All nominated resources are worth investigating to see the range of DH work out there.

There is no financial prize associated with these community awards. The nominations procedure is overseen by an international nominations committee who will decide on final candidates for each category based on whether they meet the above criteria.

The nominated article is available via open access: Encoding the haunting of an object catalogue: on the potential of digital technologies to perpetuate or subvert the silence and bias of the early-modern archive Alexandra Ortolja-Baird, Julianne Nyhan, Digital Scholarship in the Humanities, fqab065, https://doi.org/10.1093/llc/fqab065 (October 2021)

The abstract for the paper summarises:

The subjectivities that shape data collection and management have received extensive criticism, especially with regards to the digitization projects and digital archives of galleries, libraries, archives and museums (GLAM institutions). The role of digital methods for recovering data absences is increasingly receiving attention too. Conceptualizing the absence of non-hegemonic individuals from the catalogues of Sir Hans Sloane as an instance of textual haunting, this article will ask: to what extent do data-driven approaches further entrench archival absences and silences? Can digital approaches be used to highlight or recover absent data? This article will give a decisive overview of relevant literature and projects so as to examine how digital tools are being realigned to recover, or more modestly acknowledge, the vast, undocumented network of individuals who have been omitted from canonical histories. Drawing on the example of Sloane, this article will reiterate the importance of a more rigorous ethics of digital practice, and propose recommendations for the management and representation of historical data, so cultural heritage institutions and digital humanists may better inform users of the absences and subjectivities that shape digital datasets and archives. This article is built on a comprehensive survey of digital humanities’ current algorithmic approaches to absence and bias. It also presents reflections on how we, the authors, grappled with unforeseen questions of absence and bias during a Leverhulme-funded collaboration between the British Museum and University College London (UCL), entitled ‘Enlightenment Architectures: Sir Hans Sloane’s Catalogues of his collections’.

New article ‘Encoding the haunting of an object catalogue’

Lucy Stagg, 25 November 2021

Alexandra Ortolja-Baird and Julianne Nyhan have co-authored a new article, ‘Encoding the haunting of an object catalogue: on the potential of digital technologies to perpetuate or subvert the silence and bias of the early-modern archive’.

The article, published in Digital Scholarship in the Humanities on 19 October 2021, is available via open access.

Hans Sloane’s nautilus shell carved by Johannes Belkien in the late 1600s. C021.7733 ©The Trustees of the Natural History Museum, London

The abstract summarises the article as follows:

The subjectivities that shape data collection and management have received extensive criticism, especially with regards to the digitization projects and digital archives of galleries, libraries, archives and museums (GLAM institutions). The role of digital methods for recovering data absences is increasingly receiving attention too. Conceptualizing the absence of non-hegemonic individuals from the catalogues of Sir Hans Sloane as an instance of textual haunting, this article will ask: to what extent do data-driven approaches further entrench archival absences and silences? Can digital approaches be used to highlight or recover absent data? This article will give a decisive overview of relevant literature and projects so as to examine how digital tools are being realigned to recover, or more modestly acknowledge, the vast, undocumented network of individuals who have been omitted from canonical histories. Drawing on the example of Sloane, this article will reiterate the importance of a more rigorous ethics of digital practice, and propose recommendations for the management and representation of historical data, so cultural heritage institutions and digital humanists may better inform users of the absences and subjectivities that shape digital datasets and archives. This article is built on a comprehensive survey of digital humanities’ current algorithmic approaches to absence and bias. It also presents reflections on how we, the authors, grappled with unforeseen questions of absence and bias during a Leverhulme-funded collaboration between the British Museum and University College London (UCL), entitled ‘Enlightenment Architectures: Sir Hans Sloane’s Catalogues of his collections’.

The full article can be read and downloaded at https://doi.org/10.1093/llc/fqab065

UCLDH research activity June 2021

Lucy Stagg, 30 June 2021

The UCLDH team have been busy as ever, despite continuing COVID-19 restrictions. Here’s just a few examples of recent activity:

Adam Crymble has published a monograph, Technology & the Historian: Transformations in the Digital Age (University of Illinois Press, 2021), and a piece co-authored with Maria José Afanador-Llach, ‘The Globally Unequal Promise of Digital Tools for History: UK and Colombia Case Study’, in Adele Nye (ed.), Teaching History for the Contemporary World (Springer, 2021), pp. 85–98.

Oliver Duke-Williams has been doing a lot of engagement work around the 2021 Census, including a radio interview with talkRadio. Read his co-authored blog post, The ebb and flow of UK census data.

Julianne Nyhan has had various publications, including ‘Named-entity recognition for early modern textual documents: a review of capabilities and challenges with strategies for the future’ (Journal of Documentation, 2021; co-authored with Marco Humbel, Andreas Vlachidis, Kim Sloan and Alexandra Ortolja-Baird).

Patrick White has been co-leading a workshop series called Working With Code, in collaboration with Research IT Services, for Slade students making work in coding environments such as Godot (a game engine), Arduino (microcontrollers), Sonic Pi (live music production based on Ruby), and p5.js (a JavaScript version of the Processing environment).

Tim Williams has been working on the Central Asian Archaeological Landscapes project. Their geospatial database, managed in QGIS, currently comprises 52,408 sites. Of these, 17,123 were known sites, gathered through the digitisation of archival material by our partners in Central Asia, while 35,285 have been digitised from a range of satellite imagery. They are exploring approaches to automatic change detection and Google Earth algorithms for automatic site detection. They are also using historic imagery (CORONA, Google Earth, etc.), DEMs, and scanned and geo-rectified Soviet maps to create historical map layers, to examine landscape change, destruction, damage and potential threats to archaeological heritage. There is currently over 8TB of clean archival data on UCL Research Data storage, comprising 137,173 files scanned in 6,749 folders. Each folder is a document (notebook, passport folder, envelope with films, etc.). This data is linked with the public-facing Arches platform and the UCL Open Data Repository. As a test, they have very recently placed six sets of geospatial data (17.45GB) on the UCL Research Data Repository; these have already been viewed 2,540 times, with 1,973 items downloaded. From the repository there are also links to other digital material, for example 3D models on Sketchfab.

Semantic Web for Cultural Heritage special issue now published

Lucy Stagg, 1 February 2021

UCLDH member Antonis Bikakis has co-edited (along with four colleagues from France, Italy and Finland) the Semantic Web for Cultural Heritage special issue, which is now published and freely available to read in Semantic Web – Interoperability, Usability, Applicability, volume 12(2), 2021.

The papers cover a wide spectrum of modelled topics related to language, reading and writing, narratives, historical events and cultural artefacts, while describing reusable methodologies and tools for cultural data management.

Enlightenment architectures: The reconstruction of Sir Hans Sloane’s cabinets of ‘Miscellanies’

Lucy Stagg, 9 December 2020

UCLDH Director Dr Julianne Nyhan and Dr Kim Sloan, the Francis Finlay Curator of the Enlightenment Gallery at the British Museum, have recently had an article published by the OUP Journal of the History of Collections. The article is based on their work undertaken as part of the Leverhulme Trust-funded research project, Enlightenment Architectures: Sir Hans Sloane’s Catalogues of his Collections (2016–19), a collaboration between the British Museum and University College London. Abstract:

Focusing on Sir Hans Sloane’s catalogue of ‘Miscellanies’, now in the British Museum, this paper asks firstly how Sloane described objects and secondly whether the original contents of the cabinets can be reconstructed from his catalogue. Drawing on a sustained, digitally augmented analysis – the first of its kind – of Sloane’s catalogues, we respond to these questions and offer an initial analysis of the contents of the cabinets that held the miscellaneous objects at Sloane’s manor house in Chelsea. Knowledge of how and why Sloane catalogued this part of his collection has hitherto remained underdeveloped. We argue that his focus on preservation and documentation in his cataloguing did not preclude a research role, but rather was founded on immersive participation.

The full article, Enlightenment architectures: The reconstruction of Sir Hans Sloane’s cabinets of ‘Miscellanies’, is available to read for free.

New report on ‘Sustaining Digital Humanities in the UK’

Lucy Stagg, 9 October 2020

This report, published by the Software Sustainability Institute (SSI), lists a set of recommendations for SSI to further its activity in and engagement with the Digital Humanities community in the UK.

SSI’s aim is to develop better research software at a time when digital methods and infrastructure are becoming increasingly important within the arts and humanities research landscape.

The report was led by Giles Bergel and Pip Willcox, with contributions from a number of other academics including our new Director, Julianne Nyhan.

The full report is freely available to read and download: Sustaining the Digital Humanities in the UK.

The Atlas of Digitised Newspapers and Metadata: Reports from Oceanic Exchanges

Julianne Nyhan, 7 February 2020

Beals, M. H. and Emily Bell, with contributions by Ryan Cordell, Paul Fyfe, Isabel Galina Russell, Tessa Hauswedell, Clemens Neudecker, Julianne Nyhan, Mila Oiva, Sebastian Padó, Miriam Peña Pimentel, Lara Rose, Hannu Salmi, Melissa Terras, and Lorella Viola. The Atlas of Digitised Newspapers and Metadata: Reports from Oceanic Exchanges. Loughborough: 2020. DOI: 10.6084/m9.figshare.11560059

The Oceanic Exchanges team has just published a substantial open access resource that will advance the state of the art of the cross-collection text analysis of selected North-Atlantic and Anglophone-Pacific retrodigitised nineteenth-century newspapers. We also hope that the approach set out in the report will be taken up by other researchers who wish to engage in foundational research on approaches to cross-collection computational analysis. As the project notes:

the rise of digitisation promises great opportunities for those who wish to engage with newspaper archives, but as with all historical archives, digital collections require researchers to be mindful of their shape, provenance and structure before any conclusion can be drawn. It is the responsibility of both digitiser and researcher to understand both the map and the terrain (see here).

The numerous newspaper digitisation projects undertaken in recent years have resulted in the remediation of many millions of pages of nineteenth-century newspapers. Yet researchers who wish to pursue questions about global history, for example, have often found it difficult to carry out data-driven research across those digitised collections. As our report discusses, there are many reasons for this: digitisation projects are often undertaken in national settings, while newspapers often participate in global conversations; standards that could overarch and integrate numerous, disparate digital newspaper collections have not been implemented; the shape and scope of digitised newspaper collections is informed by a multiplicity of situated contexts that can be difficult for those external to digitisation projects to establish; and, though digital newspapers are often encoded in line with METS/ALTO, for example, notable variations exist in how those metadata specifications are applied across collections.

To respond to this, and to further research that takes place across digital newspaper collections, this 200-page report brings together qualitative data, metadata and paradata about selected digitised newspaper databases. It provides crucial historical and contextual information about the circumstances under which those collections came into being. It provides a textual ontology that describes the relationships between the informational units of which the respective databases are comprised, between the data and metadata of the different collections, and between analogue newspapers and their retrodigitised representations. Also included are maps which support the visual inspection and comparison of data across disparate newspaper collections, along with JSON or XPath paths to the data.
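Paths of this kind can be used programmatically to pull text out of digitised newspaper files. The following is a minimal, purely illustrative Python sketch: the ALTO fragment and the path are hypothetical stand-ins, since real collections differ in schema version and structure.

```python
import xml.etree.ElementTree as ET

# A minimal, hypothetical ALTO fragment of the kind such XPath paths
# point into; real digitised newspaper files are far larger and vary
# between collections.
ALTO = """<alto xmlns="http://www.loc.gov/standards/alto/ns-v2#">
  <Layout><Page><PrintSpace><TextBlock>
    <TextLine>
      <String CONTENT="LATEST"/><String CONTENT="TELEGRAMS"/>
    </TextLine>
  </TextBlock></PrintSpace></Page></Layout>
</alto>"""

NS = {"alto": "http://www.loc.gov/standards/alto/ns-v2#"}
root = ET.fromstring(ALTO)

# Analogous to following a path such as
# /alto/Layout/Page/PrintSpace/TextBlock/TextLine/String/@CONTENT
words = [s.get("CONTENT") for s in root.findall(".//alto:String", NS)]
print(" ".join(words))  # LATEST TELEGRAMS
```

The same pattern generalises: once a collection's path to the word-level content is known, cross-collection extraction reduces to swapping in the right path per database.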

This report has come about in the context of the Oceanic Exchanges (2017–19) project, on which UCLDH’s Julianne Nyhan was UK PI and Tessa Hauswedell was UCL Research Associate. The project was funded through the Transatlantic Partnership for Social Sciences and Humanities 2016 Digging into Data Challenge, and brought together leading efforts in computational periodicals research from six countries (Finland, Germany, Mexico, the Netherlands, the United Kingdom, and the United States) to examine patterns of information flow across national and linguistic boundaries.

The project is also immensely grateful to the many groups and organisations involved in the digitisation of historical newspapers who agreed to be interviewed and consulted during the process of researching the report. You can find the report, metadata maps and other resources here: https://www.digitisednewspapers.net/

New open access publication

Julianne Nyhan, 22 November 2019

MS 3972C vol. VI, f.7. British Library (Public domain in most countries except the UK). An annotated extract from Sloane’s catalogue of printed material showing composite parts of individual catalogue entries. For readability we have dropped the enlightenment architectures namespace prefix.

Julianne Nyhan, UCLDH Deputy-Director, and colleagues in the Leverhulme-funded ‘Enlightenment Architectures: Sir Hans Sloane’s catalogues of his collections’ (2016–19), recently published a substantial open access article in the Open Library of Humanities.

This article presents a significant aspect of the work of the ‘Enlightenment Architectures’ project, a collaboration between the British Museum and University College London including contributions from the British Library and the Natural History Museum. The project investigates Sir Hans Sloane’s (1660-1753) original handwritten catalogues of his collections in order to understand their highly complex information architecture and intellectual legacies. To do so, the project has employed computational analysis to examine how Sloane’s catalogues are composed and the way their structure and content relate to the world from which his collections were assembled – the first substantial example of such an approach.

Digital Humanities in the Memory Institution addresses some of the challenges of seeking to integrate the methods of digital humanities with those of cataloguing, inventory, curatorial and historical studies, especially in the context of early modern documentary sources. Focusing on two case studies which exemplify the complexities of encoding Hans Sloane’s catalogues in accordance with the Text Encoding Initiative (TEI), the article sheds light on both the technical and epistemological challenges of encoding early modern catalogues, while emphasising the new questions and perspectives that arise from such complications. Most strikingly, the article draws attention to the parallels between early modern and current classification systems, and the on-going dilemma of how best to use language to describe objects.
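To give a flavour of the kind of encoding at issue, here is a minimal, purely hypothetical sketch of a TEI-style catalogue entry parsed in Python; the element names and content are invented for illustration, and the project’s actual schema choices are documented in the article itself.

```python
import xml.etree.ElementTree as ET

# Hypothetical TEI-style entry; element names and content are invented
# for illustration only, not taken from the project's schema.
ENTRY = """<item xmlns="http://www.tei-c.org/ns/1.0" n="123">
  <objectName>A nautilus shell</objectName>
  <note>carved, with figures</note>
</item>"""

NS = {"tei": "http://www.tei-c.org/ns/1.0"}
item = ET.fromstring(ENTRY)

# Once encoded, the composite parts of an entry become addressable:
name = item.find("tei:objectName", NS).text
note = item.find("tei:note", NS).text
print(item.get("n"), name, "-", note)
```

Making each component of an entry addressable in this way is what enables the computational analysis of how the catalogues are composed.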

Digital Humanities in the Memory Institution has resonance for the institutions, individuals and communities alike who research, curate, archive or simply browse digital heritage collections.

See: Digital Humanities in the Memory Institution: The Challenges of Encoding Sir Hans Sloane’s Early Modern Catalogues of His Collections. Ortolja-Baird, A., Pickering, V., Nyhan, J., Sloan, K. and Fleming, M., Open Library of Humanities, 5(1), 2019, p. 44. DOI: http://doi.org/10.16995/olh.409

Corpus analysis reveals ‘Routine politeness in American and British English requests’

Lucy Stagg, 23 April 2019

UCLDH team member Dr Rachele De Felice has had an article published in the Journal of Politeness Research.

Co-authored with M. Lynne Murphy, the article is entitled Routine politeness in American and British English requests: use and non-use of please (Journal of Politeness Research 15(1), 77-100). The article extract explains further:

This paper looks at the use and non-use of please in American and British English requests. The analysis is based on request data from two comparable workplace email corpora, which have been pragmatically annotated to enable retrieval of all request speech acts regardless of formulation. 675 requests are extracted from each of the two corpora; the behaviour of please is analyzed with regard to factors such as imposition level, sentence mood, and modal verb type.

Rachele’s research is in the field of corpus pragmatics. It focuses on speech act annotation and the creation of pragmatic profiles of Business English by applying corpus analysis and natural language processing (NLP) techniques to large collections of real-world data.
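The tabulation described in the abstract, counting use versus non-use of please across annotated requests, can be sketched in a few lines of Python. The toy data below is invented purely for illustration; the study itself drew on 675 annotated requests per corpus.

```python
from collections import Counter

# Invented toy data standing in for pragmatically annotated requests:
# (variety, request text, whether the request contains "please").
requests = [
    ("AmE", "Please send the report.", True),
    ("AmE", "Can you send the report?", False),
    ("BrE", "Could you please confirm?", True),
    ("BrE", "Would you confirm receipt?", False),
    ("BrE", "Please advise.", True),
]

# Tally please use per variety, as a request-level rate.
counts = Counter((variety, has_please) for variety, _, has_please in requests)
for variety in ("AmE", "BrE"):
    used = counts[(variety, True)]
    total = used + counts[(variety, False)]
    print(f"{variety}: {used}/{total} requests use please")
```

Because the corpora were annotated at the speech-act level, counts like these can then be broken down further by imposition level, sentence mood, or modal verb type.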