Digital Humanities Pedagogy: Practices, Principles and Politics.

Ontology-driven software development.

Digital Preservation Metadata for Practitioners.

A Macroscope for Global History: Seshat Global History Databank, a methodological overview. Abstract: This article introduces the "Seshat: Global History" project, the methodology it is based upon, and its potential as a tool for historians and other humanists.
Seshat is a comprehensive dataset covering human cultural evolution since the Neolithic.

Codespeakkit/bibliography.md at master · scholarslab/codespeakkit.

Demystifying Networks, Parts I & II. Journal of Digital Humanities. Scott B. Weingart. Part 1 of n: An Introduction. This piece builds on a bunch of my recent blog posts that have mentioned networks. Elijah Meeks has already prepared a good introduction to network visualizations on his own blog, so I cover more of the conceptual issues here, hoping to reach people with little-to-no background in networks or math, and specifically digital humanists interested in applying network analysis to their own work.

On Building. I’ve said a few controversial things over the course of my career, and it seems to me that if you are so honored as to have other people talking about what you said, you should probably sit back and let people respond without trying to defend yourself against every countercharge.
But I’m worried that my late remarks at MLA 11 are touching a nerve in a way that is not provocative (in the good sense), but blithely exclusionary.

Google’s AI translation tool seems to have invented its own secret internal language. All right, don’t panic, but computers have created their own secret language and are probably talking about us right now.
Well, that’s kind of an oversimplification, and the last part is just plain untrue. But there is a fascinating and existentially challenging development that Google’s AI researchers recently happened across. You may remember that back in September, Google announced that its Neural Machine Translation system had gone live. It uses deep learning to produce better, more natural translations between languages. Cool! Following on this success, GNMT’s creators were curious about something.
Who's In and Who's Out. [I’m pleased to offer a transcript of my pithy, underdeveloped position paper at the “History and Future of Digital Humanities” panel at the 2011 MLA.
The Changing Culture of Humanities Scholarship: Iteration, Recursion, and Versions in Scholarly Collaboration Environments. Susan Brown (University of Guelph & University of Alberta), John Simpson (University of Alberta), the INKE Research Team, & the CWRC Project Team. Susan Brown is Professor of English at the University of Guelph and Visiting Professor in English and Film Studies, and Humanities Computing, at the University of Alberta. Email: email@example.com. John Simpson holds a PhD in Philosophy and is a postdoctoral fellow at the University of Alberta pursuing research on the Semantic Web and Linked Open Data. INKE Research Team: Implementing New Knowledge Environments is a Major Collaborative Research Initiatives research grant funded by the Social Sciences and Humanities Research Council of Canada.

Facebook (FB) is scrambling to catch up to Google (GOOG) in open-sourcing artificial intelligence code — Quartz. In artificial intelligence research, free code garners goodwill from the community, talent, and bragging rights.
So it’s no surprise that many of the companies investing in AI, like Facebook and Google, are racing to make their code open source early and often. On Aug. 25, Facebook announced that it would make public three tools integral to its image recognition software—the same AI that automatically tags photos and helps read the content of images to visually impaired users of its site.
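The search use case that such image-recognition tools enable, finding photos by what they depict rather than by tags people assigned, reduces to an inverted index from predicted labels to images. A minimal sketch of that idea, with a hard-coded label dictionary standing in for a real recognition model (all filenames and labels here are invented for illustration; this is not Facebook's actual tooling):

```python
from collections import defaultdict

# Stand-in for a recognition model: image -> labels a model might predict.
# (Hard-coded for illustration; a real system would run inference here.)
predicted_labels = {
    "img_001.jpg": ["dog", "beach", "person"],
    "img_002.jpg": ["pizza", "table"],
    "img_003.jpg": ["dog", "park"],
}

# Inverted index: label -> set of images whose content includes it.
index = defaultdict(set)
for image, labels in predicted_labels.items():
    for label in labels:
        index[label].add(image)

def search(query_labels):
    """Return images whose predicted content matches every query label."""
    sets = [index.get(label, set()) for label in query_labels]
    return sorted(set.intersection(*sets)) if sets else []

print(search(["dog"]))           # all images a model labeled "dog"
print(search(["dog", "beach"]))  # images matching both labels
```

No user-assigned tags appear anywhere: the index is built entirely from model predictions, which is what makes the described search possible for untagged photos.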
The social media company says that this kind of technology could allow Facebook users to search for photos based on what they depict, without relying on the tags others had assigned to the images. Facebook also claims it could be used to identify the nutritional value of food just by taking a picture of it. The tools are powerful; they provide the entire framework to pick apart the elements of a photo and label the separate parts.

Digital_Humanities: THE Book! Anne Burdick, Johanna Drucker, Peter Lunenfeld, Todd Presner and Jeffrey Schnapp, Digital_Humanities, Cambridge 2012. A short cut and some remarks. “Two decades ago, working with digital documents was the exception. Today it is the norm,” Anne Burdick and her co-authors write; and: “If the humanities are to thrive and not just exist in niches of privilege, they will have to visibly demonstrate the contributions to knowledge and society they are making in the digital era.” This extremely interesting book from MIT Press studies emerging methods and genres and cases such as Geographical Information Systems (GIS), new textual corpora, and virtual reconstructions; it deals with “the social life of the Digital Humanities,” confronts us with some “provocations,” and ends with a short practical guide to the emerging Digital Humanities. This book will change some basic guidelines for the perception of the production of knowledge, so it seems. Tools are not just tools, the authors say.

Big Data in History. Patrick Manning, Big Data in History, London 2013. The author, Patrick Manning, is Professor of World History at the University of Pittsburgh.
He pursues a big goal: as Director of the Center for Historical Information and Analysis (CHIA), he wants to develop and build up a worldwide historical archive, believing the time has come to create a coherent record of human social change. He compares his project to those in climate modeling and genetic databases. This small book is an introduction to the CHIA project, which has existed since 2007. The global dataset on human societal activities is intended to cover four to five centuries. Professor Manning’s challenge is huge: there are large quantities of complex data to be collected and processed.

Digital Humanities and Crowdsourcing: An Exploration.
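Databanks like Seshat and the CHIA archive organize history as coded observations: a variable measured for a polity or place over a time span. A minimal sketch of that data model and a time-slice query over it (the records, values, and field names below are invented for illustration and do not reflect either project's actual schema):

```python
from dataclasses import dataclass

@dataclass
class Observation:
    """One coded data point: a variable measured for a polity in a time span."""
    polity: str
    variable: str
    year_start: int  # negative years = BCE
    year_end: int
    value: object

# A toy set of coded observations (values invented for illustration only).
records = [
    Observation("Roman Empire", "population", 1, 100, 50_000_000),
    Observation("Roman Empire", "writing_system", 1, 100, "alphabetic"),
    Observation("Han China", "population", 1, 100, 57_000_000),
]

def query(records, variable, year):
    """Return all observations of `variable` whose time span covers `year`."""
    return [r for r in records
            if r.variable == variable and r.year_start <= year <= r.year_end]

# Cross-polity comparison at a single point in time: population in 50 CE.
print([(r.polity, r.value) for r in query(records, "population", 50)])
```

The point of the structure is comparability: because every observation carries the same fields, a single query can line up societies across the globe at one moment in time, which is exactly the kind of coherent record of social change these projects aim for.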