CC license version 4.0: Helping meet the needs of open data publishers and users. Over the last few months, Creative Commons has been working on the next version of its license suite, version 4.0. The goals of version 4.0 are wide-ranging, but the overall objective is clear: update the licenses so they are considerably more robust, yet easy to understand and use, for both existing communities and new types of users. One key community that version 4.0 aims to serve better is public sector agencies releasing data.
Public sector information can be of great value, but the public needs to know what they can do with it. At the same time, public sector agencies need to be reassured that they can offer data in a way that gives them credit, maintains their reputation, and ensures some level of data integrity. Sui generis database rights: one area of particular interest to European data publishers and users will be the shift in how CC licenses handle these rights. Other parts of the revision aim to strengthen reputation and integrity and to update attribution.
What is big data? Big data is data that exceeds the processing capacity of conventional database systems. The data is too big, moves too fast, or doesn't fit the strictures of your database architectures. To gain value from this data, you must choose an alternative way to process it. The hot IT buzzword of 2012, big data has become viable as cost-effective approaches have emerged to tame the volume, velocity and variability of massive data. Within this data lie valuable patterns and information, previously hidden because of the amount of work required to extract them. To leading corporations, such as Walmart or Google, this power has been in reach for some time, but at fantastic cost.
Today's commodity hardware, cloud architectures and open source software bring big data processing into the reach of the less well-resourced. Big data processing is eminently feasible even for small garage startups, who can cheaply rent server time in the cloud. What does big data look like? Two commonly cited dimensions are its volume and its velocity.
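To make "choose an alternative way to process it" concrete, here is a minimal sketch, in Python, of one such alternative: streaming a file that is too large for memory and aggregating it incrementally instead of loading it into a conventional database. The file name and field layout are hypothetical, invented only for illustration.

```python
from collections import Counter

def count_events_by_type(path, batch_size=1_000_000):
    """Aggregate a file too large for memory by streaming it line by line.

    Assumes a hypothetical log format: one event per line, with the
    event type as the first comma-separated field.
    """
    counts = Counter()
    batch = []
    with open(path) as f:
        for line in f:
            batch.append(line.split(",", 1)[0])
            if len(batch) >= batch_size:
                counts.update(batch)  # fold each batch into the running totals
                batch = []
    counts.update(batch)  # fold in the final partial batch
    return counts

# Hypothetical usage; "clickstream.log" is an invented file name.
# print(count_events_by_type("clickstream.log").most_common(10))
```

The same fold-per-chunk pattern is what frameworks such as Hadoop distribute across many machines once a single machine's disk and CPU are no longer enough.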
@MarsCuriosity: People's Insights: Volume 1, Issue 29. What is @MarsCuriosity? @MarsCuriosity is the Twitter account of NASA's latest robot rover on Mars. @MarsCuriosity burst into the spotlight as the NASA social media team live-tweeted its successful landing on Mars on August 5/6, 2012. Curiosity is on a mission to find out whether life ever existed on the red planet, and to collect data for a future manned mission to Mars. The rover runs on a combination of nuclear and solar energy, and has been designed to operate for two years on the surface of Mars. Curiosity launched from Florida and travelled eight months and 566 million kilometres to Mars.
Transmedia storytelling: in 2008, the three-person social media team created a Twitter account. The first tweet from @MarsCuriosity: "I'm WAY cool, nearly built, and I need a name." @MarsCuriosity also curated user-generated content and interviews about her mission to Mars: "This week, I've been testing my newly attached arm & practicing hand-eye coordination." "I HAVE LIFTOFF!" Importance for NASA. What IT leaders say about big data. Data is the lifeblood of an organisation, but its flow needs to be managed effectively.
The challenge for IT chiefs is that data volumes are getting bigger, driven by the explosion of unstructured data and of social media sites such as Facebook and Twitter that hold valuable data organisations want to exploit. The desire to sweat information assets for competitive advantage is being fuelled by the tough economic climate, but many IT leaders are struggling to extract value and empower business users in the face of a big data skills deficit and concerns over the governance, privacy and security of data. IT leaders met at a recent Computer Weekly roundtable event, in association with Oracle, to discuss the possibilities and challenges of big data.
Where to use big data: many of the IT leaders at the event were at the exploratory stage of using big data technologies, but could see their potential. Customer engagement was one such use, though with a caveat about the veracity of the data. 3rd Usage Area Workshop: Future Internet Initiative. Date: 28 and 29 June 2011. Venue: Hotel Husa President Park, Brussels, Belgium. Co-organised by the EX-FI Support Action, the CONCORD Support Action and the European Commission. Invitation: following the successful 2nd Future Internet Usage Areas workshop in Brussels last year, the EX-FI project, the CONCORD project, the EFIA Industry group and the European Commission would now like to invite you to the 3rd Usage Areas workshop.
This workshop aims to cultivate a common discussion among stakeholders from all sectors that use, or plan to use, the services and facilities of the Future Internet (FI), in order to build an extended FI community and to foster a holistic, cross-sector approach to accelerating the Future Internet in Europe. The workshop will enable a detailed discussion of the Usage Area requirements, and in particular of how they can form part of the bigger picture of Future Internet evolution in Europe.
The themes and open questions to be addressed during the workshop include: Higgs boson: landmark announcement clears key hurdle. The announcement two months ago that physicists had discovered a particle consistent with the famous Higgs boson cleared a formal hurdle on Monday with publication in a peer-reviewed journal. Two teams working at the European Organisation for Nuclear Research (CERN) had jointly announced on July 4 that they had detected a new fundamental particle in experiments at the Large Hadron Collider near Geneva. The discovery has been hailed as one of the biggest scientific achievements ever. The teams, from the experiments called Atlas and the Compact Muon Solenoid (CMS), on Monday each published their findings in the European journal Physics Letters B. They are available at www.sciencedirect.com/science/article/pii/S037026931200857X and www.sciencedirect.com/science/article/pii/S0370269312008581.
Although CERN's announcement was never doubted, it still had to be vetted by peers and then published in an established journal to meet benchmarks of accuracy and openness. Voyager’s long goodbye. Are we there yet? Ed Stone, the project scientist for NASA’s two Voyager spacecraft, wants to know. Since their launch in 1977, the probes have ventured billions of kilometres beyond the outer planets. Now, Stone and his colleagues are looking for signs that Voyager 1 may finally be nearing the edge of the Solar System — where the heliosphere, the bubble of electrically charged particles blown outwards by the Sun, gives way to interstellar space (see ‘Edging into the unknown’).
Detecting and characterizing this threshold — called the heliopause — would be the ultimate bonus for a probe that logged its 35th year in space on 5 September. When Voyager 1 set out, says Stone, a physicist at the California Institute of Technology in Pasadena, who has coordinated the mission since the probes launched, “the space age was only 20 years old and there was no evidence that any spacecraft could travel this long and this far from the Sun”.
Voyager 1 was launched in 1977 (image: NASA/JPL-Caltech). Think through your cloud plans -- or else | Cloud Computing. Most enterprise IT organizations focus more on technology than on thinking -- a sad tendency I've often pointed out. Thus, I was happy to see InfoQ's Mark Little review an article by Steve Jones of Capgemini. Both see the same lack of thought in how enterprises use technology. In fact, it's worse than not thinking -- there's an active dislike of deeper consideration that gets expressed as ignoring or even disparaging planning, architecture, and design in IT. This sorry state is quite evident as cloud computing begins to take hold in the standard IT technology arsenal. The fact of the matter is that there are two worlds. One involves the hype and good feelings about next-gen IT, such as cloud computing, that tell us the technology itself will save us from the mistakes of the past.
Then there's the world of planning, architecture, and design that makes the technology actually useful -- despite IT's aversion to this crucial stage. VENUS-C: building a highly scalable and flexible Cloud infrastructure. VENUS-C is a project funded under the European Commission's 7th Framework Programme, drawing its strength from a joint co-operation between computing service providers and scientific user communities to develop, test and deploy a large Cloud computing infrastructure for science and SMEs in Europe.
Experts on the project report: "The great success of the Generalized Worker Role in VENUS-C has emphasized to me that building a library of customizable Roles or Appliances is a powerful cloud programming model." "It was particularly revealing to see so many different applications being in need of cloud services, yet with rather diverse characteristics and diverse requirements." "I have improved my knowledge about commercial and academic cloud activities and interoperability topics, and increased my understanding of industrial cloud applications."
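The "Generalized Worker Role" quoted above refers to a common cloud programming pattern: a pool of identical workers pulls jobs from a queue, and only the per-job processing step is customized. Here is a minimal, illustrative sketch of that pattern in Python; the job names and the processing step are invented, and a real deployment would use a distributed queue rather than an in-process one.

```python
import queue
import threading

def process(item: str) -> None:
    # Hypothetical payload; a real worker role would run a scientific job here.
    print(f"processing {item}")

def worker(tasks: queue.Queue) -> None:
    """A generalized worker: pull a job, process it, repeat until told to stop."""
    while True:
        item = tasks.get()
        try:
            if item is None:  # sentinel value: no more work for this worker
                return
            process(item)
        finally:
            tasks.task_done()

tasks: queue.Queue = queue.Queue()
workers = [threading.Thread(target=worker, args=(tasks,)) for _ in range(4)]
for w in workers:
    w.start()
for job in ["job-1", "job-2", "job-3"]:
    tasks.put(job)
for _ in workers:
    tasks.put(None)  # one stop signal per worker
for w in workers:
    w.join()
```

Customizing only `process` while reusing the surrounding loop is what makes a library of such roles or appliances attractive.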
Three kinds of big data. In the past couple of years, marketers and pundits have spent a lot of time labeling everything "big data." The reasoning goes something like this: everything is on the Internet; the Internet has a lot of data; therefore, everything is big data. When you have a hammer, everything looks like a nail. When you have a Hadoop deployment, everything looks like big data. And if you're trying to cloak your company in the mantle of a burgeoning industry, big data will do just fine.
We saw this with cloud computing. So where will big data go to grow up? Once we get over ourselves and start rolling up our sleeves, I think big data will fall into three major buckets: Enterprise BI, Civil Engineering, and Customer Relationship Optimization. Enterprise BI 2.0: for decades, analysts have relied on business intelligence (BI) products like Hyperion, Microstrategy and Cognos to crunch large amounts of information and generate reports. Most "legacy" BI tools are constrained in two ways. Linked Data Basics for Techies - OpenOrg. Intended audience: this is intended to be a crash course for a techie/programmer who needs to learn the basics ASAP. It is not intended as an introduction for managers or policy makers (I suggest looking at Tim Berners-Lee's TED talks if you want the executive summary).
It's primarily aimed at people who're tasked with creating RDF and don't have time to faff around. It will also be useful to people who want to work with RDF data. RDF is a data structure perfect for people creating mash-ups! Please send feedback -- especially if something doesn't make sense! If you are new to RDF/Linked Data then you can help me!
I put a fair bit of effort into writing this, but I am too familiar with the field! If you are learning for the first time and something in this guide isn't explained very well, please drop me a line so I can improve it: cjg@ecs.soton.ac.uk. Warning: some things in this guide are deliberately over-simplified. Alternatives (suggest more!). Structure: tree data (JSON, XML); graph data (RDF).
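To make the tree-versus-graph distinction concrete, here is a small sketch in Python; the data is invented for the example. A tree nests each fact under exactly one parent, while a graph states facts as subject-predicate-object triples, so any node can link to any other.

```python
# Tree-shaped data (what JSON and XML encourage): one parent per node.
person_tree = {
    "name": "Alice",
    "affiliation": {"name": "University of Southampton"},
}

# Graph-shaped data (what RDF encourages): a set of triples.
# Shared nodes and cross-links are natural here.
triples = {
    ("Alice", "worksAt", "University of Southampton"),
    ("Bob", "worksAt", "University of Southampton"),  # shared node
    ("Alice", "knows", "Bob"),
}

# A simple query: who works at the same place as Alice?
place = next(o for s, p, o in triples if s == "Alice" and p == "worksAt")
print({s for s, p, o in triples if p == "worksAt" and o == place and s != "Alice"})
# -> {'Bob'}
```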
What Is Digital Humanities and What's it Doing in the Library? Tl;dr – libraries and digital humanities have the same goals. Stop asking whether the library has a role, or what it is, and start getting involved in digital projects that are already happening. Advocate for new, expanded roles and responsibilities to be able to do this. Become producers and creators in collaboration with scholars rather than servants to them. Comprehending the Digital Humanities – from Elijah Meeks at Stanford. Introduction – On Kirschenbaum: in the spring of 2011, Matthew Kirschenbaum, Professor of English at the University of Maryland, published a piece for the Association of Departments of English titled "What Is Digital Humanities and What's it Doing in English Departments?" Aside from the complications of defining what digital humanities is and is not, it is in this publicly visible, collaborative, online network and infrastructure that the library should begin to see itself.
What You Do with a Million Books, Screwmeneutically Speaking: The Library as Place – On Ramsay. Projekt GRAMMIS. Project description and goals: the project aims to provide a comprehensive, multimedia, electronically networked information system on German grammar that can be accessed worldwide over the Internet. The system is intended to open up important results of the work of the IDS Grammar Department to the general public. The GRAMMIS system developed so far contains four content components.
The core of the system is the Systematic Grammar. This component attempts to sketch a hierarchically structured overall picture of the grammar of contemporary German. From a syntactic perspective, it describes units of expression at every level of complexity. Further components of GRAMMIS are: RDF - Semantic Web Standards. Overview: RDF is a standard model for data interchange on the Web. RDF has features that facilitate data merging even if the underlying schemas differ, and it specifically supports the evolution of schemas over time without requiring all the data consumers to be changed.
RDF extends the linking structure of the Web by using URIs to name the relationship between things as well as the two ends of the link (this is usually referred to as a "triple"). Using this simple model, it allows structured and semi-structured data to be mixed, exposed, and shared across different applications. This linking structure forms a directed, labeled graph, where the edges represent the named link between two resources, represented by the graph nodes. Recommended reading: the RDF 1.1 specification consists of a suite of W3C Recommendations and Working Group Notes, published in 2014. A number of textbooks have been published on RDF and on the Semantic Web in general. There are also discussions on a possible next version of RDF.
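As a concrete illustration of the triple model, here is a minimal sketch using the Python rdflib package (which must be installed). The example namespace and the people in it are invented; FOAF is a real, widely used vocabulary.

```python
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import FOAF

EX = Namespace("http://example.org/")  # hypothetical namespace for the example

g = Graph()

# Each statement is a (subject, predicate, object) triple; the predicate is
# itself a URI, so relationships are first-class, linkable things.
g.add((EX["alice"], FOAF.name, Literal("Alice")))
g.add((EX["alice"], FOAF.knows, EX["bob"]))

# The same graph can be serialized for exchange, e.g. as Turtle.
print(g.serialize(format="turtle"))
```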
Datasets / NLP. Each and every dataset from DBpedia is potentially useful for several Natural Language Processing (NLP) tasks. We describe here a few examples of how to use these datasets. Moreover, we describe a number of extended datasets that were generated during the creation of DBpedia Spotlight and other NLP-related projects. In the context of this page, the word "resource" (as in DBpedia Resource) refers to an entity or concept identified by a DBpedia URI. 1. The core datasets from DBpedia include an ontology to model the information extracted from Wikipedia, general facts about extracted resources, as well as inter-language links. 2. The NLP datasets were created by the DBpedia Spotlight team to support entity recognition and disambiguation tasks, among others.
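As one example of using the core datasets, the extracted facts can be queried from DBpedia's public SPARQL endpoint. A minimal sketch with the Python SPARQLWrapper package follows (it requires network access); the choice of resource, Berlin, and the limit are arbitrary.

```python
from SPARQLWrapper import SPARQLWrapper, JSON

# DBpedia's public endpoint exposes the extracted facts as RDF triples.
sparql = SPARQLWrapper("https://dbpedia.org/sparql")
sparql.setReturnFormat(JSON)

# Ask for a handful of facts about one example resource.
sparql.setQuery("""
    SELECT ?property ?value WHERE {
        <http://dbpedia.org/resource/Berlin> ?property ?value .
    } LIMIT 10
""")

results = sparql.query().convert()
for row in results["results"]["bindings"]:
    print(row["property"]["value"], "->", row["value"]["value"])
```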
2.1. Contains mappings between surface forms and URIs. Created by the DBpedia Spotlight team, it has been used by DBpedia Lookup and DBpedia Spotlight. 2.2. Also created by the DBpedia Spotlight team; its example data includes Apple_Inc.
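To illustrate what a surface-form-to-URI mapping enables, here is a toy disambiguation sketch in Python. The mapping table and the scoring rule are invented for the example; real systems such as DBpedia Spotlight use far richer corpus statistics.

```python
# Toy surface-form table: one ambiguous name, several candidate URIs.
SURFACE_FORMS = {
    "Apple": [
        "http://dbpedia.org/resource/Apple_Inc.",
        "http://dbpedia.org/resource/Apple",  # the fruit
    ],
}

# Invented context words per candidate, standing in for corpus statistics.
CONTEXT = {
    "http://dbpedia.org/resource/Apple_Inc.": {"iphone", "computer", "company"},
    "http://dbpedia.org/resource/Apple": {"fruit", "tree", "pie"},
}

def disambiguate(surface_form: str, sentence: str) -> str:
    """Pick the candidate URI whose context words best overlap the sentence."""
    words = set(sentence.lower().split())
    return max(SURFACE_FORMS[surface_form],
               key=lambda uri: len(CONTEXT[uri] & words))

print(disambiguate("Apple", "Apple released a new computer today"))
# -> http://dbpedia.org/resource/Apple_Inc.
```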
BIS: Sebastian Hellmann. 1st Workshop on the Multilingual Semantic Web. richard.cyganiak.de/2007/10/lod/imagemap.html. richard.cyganiak.de/2007/10/lod/lod-datasets_2011-09-19_colored.html. Interlinking. Linked Data: Evolving the Web into a Global Data Space.