Dear users, developers and all people interested in semantic wikis,
We are happy to announce SMWCon Fall 2013 - the 8th Semantic MediaWiki
Conference:
* Dates: October 28th to October 30th 2013 (Monday to Wednesday)
* Location: A&O Berlin Hauptbahnhof, Lehrter Str. 12, 10557 Berlin, Germany
* Conference wikipage: https://semantic-mediawiki.org/wiki/SMWCon_Fall_2013
* Participants: Everybody interested in semantic wikis, especially in
Semantic MediaWiki, e.g., users, developers, consultants, business
representatives, researchers.
SMWCon Fall 2013 will be supported by the Open Semantic Data
Association e. V. [1]. Our platinum sponsor will be WikiVote ltd,
Russia [2].
Following the success of recent SMWCons, we will have one tutorial day
and two conference days.
Participating in the conference: To help us plan, you can already
register informally on the wikipage, although a firm registration will
be required later.
Contributing to the conference: If you want to present your work at
the conference, please go to the conference wikipage and add your talk
there. To create an attractive program, we will later ask you for
further information about your proposal.
Tutorials and presentations will be video and audio recorded and will
be made available for others after the conference.
==Among others, we encourage contributions on the following topics==
===Applications of semantic wikis===
* Semantic wikis for enterprise workflows and business intelligence
* Semantic wikis for corporate or personal knowledge management
* Exchange on business models with semantic wikis
* Lessons learned (best/worst practices) from using semantic wikis or
their extensions
* Semantic wikis in e-science, e-learning, e-health, e-government
* Semantic wikis for finding a common vocabulary among a group of people
* Semantic wikis for teaching students about the Semantic Web
* Offering incentives for users of semantic wikis
===Development of semantic wikis===
* Semantic wikis as knowledge base backends / data integration platforms
* Comparisons of semantic wiki concepts and technologies
* Community building, feature wishlists, roadmapping of Semantic MediaWiki
* Improving user experience in a semantic wiki
* Speeding up semantic wikis
* Integration and interoperability of semantic wikis with other
applications and mashups
* Modeling of complex domains in semantic wikis, using rules, formulas etc.
* Access control and security aspects in semantic wikis
* Multilingual semantic wikis
If you have questions, you can contact me (Yury Katkov, Program Chair),
Benedikt Kämpgen (General Chair) or Karsten Hoffmeyer (Local Chair)
by e-mail (Cc).
Hope to see you in Berlin
Yury Katkov, Program Chair
[1] http://www.opensemanticdata.org/
[2] http://wikivote.ru
Dear all,
I am happy to announce the very first release of Wikidata Toolkit [1],
the Java library for programming with Wikidata and Wikibase. This
initial release can download and parse Wikidata dump files for you, so
as to process all Wikidata content in a streaming fashion. An example
program is provided [2]. The library can also be used with MediaWiki
dumps generated by other Wikibase installations (if you happen to work
in EAGLE ;-).
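To give a rough idea of what "streaming fashion" means: the dump is parsed
incrementally and each entity is handed to a callback that you register, so
you never have to hold the full dump in memory. The sketch below only
illustrates that pattern - the class and interface names in it are invented
for the illustration and are not the toolkit's actual API; the example
program [2] shows the real classes.

// Schematic sketch of streaming dump processing. EntityHandler and
// DumpReader are invented names for this illustration, NOT Wikidata
// Toolkit classes; see the example program [2] for the actual API.
import java.util.Arrays;
import java.util.List;

public class StreamingSketch {

    /** Callback invoked once per entity while the dump is parsed. */
    interface EntityHandler {
        void handleEntity(String entityId);
    }

    /** Stand-in for the component that downloads and parses a dump. */
    static class DumpReader {
        private final List<String> entityIds;

        DumpReader(List<String> entityIds) {
            this.entityIds = entityIds;
        }

        /** Streams entities to the handler one at a time. */
        void processAll(EntityHandler handler) {
            for (String id : entityIds) {
                handler.handleEntity(id);
            }
        }
    }

    public static void main(String[] args) {
        // A tiny fake "dump"; the toolkit would download and parse a real one.
        DumpReader reader = new DumpReader(Arrays.asList("Q42", "Q64", "P31"));

        // Count items (Q...) and properties (P...) while streaming; only the
        // two counters stay in memory, never the dump itself.
        long[] counts = new long[2];
        reader.processAll(id -> counts[id.startsWith("Q") ? 0 : 1]++);

        System.out.println(counts[0] + " items, " + counts[1] + " properties");
    }
}

The same pattern scales to arbitrarily large dumps, because your handler
only ever sees one entity at a time.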
Maven users can get the library directly from Maven Central (see [1]);
this is the preferred method of installation. There is also an
all-in-one JAR on GitHub [3] and of course the sources [4].
Version 0.1.0 is of course alpha, but the code that we have is already
well-tested and well-documented. Improvements that are planned for the
next release include:
* Faster and more robust loading of Wikibase dumps
* Support for various serialization formats, such as JSON and RDF
* Initial support for Wikibase API access
Nevertheless, you can already give it a try. Later releases are also
planned to support more advanced processing after loading, especially
for storing and querying the data.
Feedback is welcome. Developers are also invited to contribute via GitHub.
Cheers,
Markus
[1] https://www.mediawiki.org/wiki/Wikidata_Toolkit
[2]
https://github.com/Wikidata/Wikidata-Toolkit/blob/v0.1.0/wdtk-examples/src/…
[3] https://github.com/Wikidata/Wikidata-Toolkit/releases
(you'll also need to install the third party dependencies manually when
using this)
[4] https://github.com/Wikidata/Wikidata-Toolkit/
Hi there,
I'm sorry - I really, properly am - for this spamming, but it's also
something that might interest the Wikidata development team.
A couple of other users and I (if selected) are going to give a
presentation at Wikimania 2014 about a project conducted by Wikimedia
Italy and the Europeana network of Ancient Greek and Latin Epigraphy
(EAGLE). The full description is here:
https://wikimania2014.wikimedia.org/wiki/Submissions/W%28iki%29B%28ase%29_l…
TL;DR: Wikimedia Italy and EAGLE are using Wikibase extensions to
build up a database of Ancient Greek and Latin epigraphy,
getting the data from various universities and institutions... and the
thing is working! :) Of course, the data are released under CC0, and there
are also plans to donate them to the Wikimedia community once the
Commons-Wikidata integration is completed.
This should also be the first project outside the WMF perimeter to use
Wikibase in this way (GerardM, please correct me if I'm wrong).
If you're interested, you might want to take a peek. :)
Sorry again for spamming!
Cheers,
--
Luca "Sannita" Martinelli
http://it.wikipedia.org/wiki/Utente:Sannita
Hi all,
Some weeks ago, Anja asked me to send an email to this list with requirements from DBpedia regarding the PubSubHubbub feed.
We are really happy that somebody has finally started working on this.
The main thing DBpedia needs is software to create an up-to-date mirror of each language version of Wikipedia. All other requirements can be deduced from this one. It would be bad for us if this is out of scope or does not work correctly at the end of the project.
All the best,
Sebastian
--
Sent from my Android device with K-9 Mail. Please excuse my brevity.
Hello everyone,
I have applied for GSoC 2014 with MediaWiki, aiming to create a plugin
that can annotate statements on various websites and feed them in as
statements, with references taken from the website URL and author in
the case of Google Books, Wikisource, etc.
The project proposal is currently hosted at:
https://www.mediawiki.org/wiki/Wikidata_annotation_tool
I need more feedback from the community, such as what extra features
the tool should provide to make this project more useful. I hope the
community considers this project useful.
So kindly take a look at the proposal and provide any comments you can.
Thanks
Note: This mail has been cross-posted to wikitech-l and wikidata-tech.
--
Amanpreet Singh,
IIT Roorkee
Hey everyone :)
Since Adam's internship with us is coming to an end, we are looking for
a new amazing intern or working student to help us out around
Wikidata. You can find all the details at
http://wikimedia.de/wiki/Working_student_Wikidata_(f/m). If you have
any questions, please let me know.
Cheers
Lydia
--
Lydia Pintscher - http://about.me/lydia.pintscher
Product Manager for Wikidata
Wikimedia Deutschland e.V.
Tempelhofer Ufer 23-24
10963 Berlin
www.wikimedia.de
Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.
(Society for the Promotion of Free Knowledge). Registered in the register
of associations of the Amtsgericht Berlin-Charlottenburg under number
23855 Nz. Recognized as charitable by the Finanzamt für Körperschaften I
Berlin, tax number 27/681/51985.