Project Goal
The goal of the project is to build an extension to the MediaWiki software that will select and reformat articles from an existing MediaWiki website for export to a new MediaWiki website. See the attached specifications for more details.
Qualifications for a Good Candidate:
1) Have strong MediaWiki development skills (XML, PHP, MySQL, etc.).
2) Be able to communicate effectively in English.
3) Be resourceful and efficient. For example, quickly learn about MediaWiki and draw on the open-source community for development information and resources (see link below under “Supporting Information”).
4) Have a good development computer and a fast internet connection.
5) Work well with other people.
6) Be dependable and honest.
Qualifications for an Excellent Candidate:
1) Have extensive experience developing wikis, specifically MediaWiki. Know how to work with the MediaWiki (Wikipedia) data files (see link below).
2) Be located in Kyiv, Ukraine (not required, but a big plus).
Supporting Information:
1) For information about Wikipedia, see [login to view URL]
2) For information about the MediaWiki software, see [login to view URL]
3) Support for MediaWiki users and developers can be found at [login to view URL]
4) Examples of data used in the first project can be found at [login to view URL]
Future Work
1) Other related and new projects are planned after this one and may be available to the right candidate.
NOTE
1) Bids can be over $300 if necessary.
I'm currently developing extensions for MediaWiki and have extensive experience with its architecture. I've recently developed code to import and export images into and out of MediaWiki under program control. I regularly extend and replace Special pages and understand much of the internal object model. I also have experience with the access-control code; I have extended MediaWiki's access control to other applications.
I assume the tagging of the original data for export to another MediaWiki instance can be done much as Wikipedia currently tags its data, as [login to view URL] takes over many of the wiki software's organizational responsibilities.
I haven't yet investigated whether the specific options you've indicated (limiting the export of images, links to other articles, etc.) are already provided for. I assume you have direct access to the existing Wikipedia systems on which the tagging will be done.
The final system will contain a catalog of all exported pages that will be used for periodic re-synchronization. I like the idea of the tree-structured diagram in which check boxes indicate which pages will be exported.
I assume that all exports will use a variation of the standard Wikipedia XML DTD. Additional attributes can be included in the XML entry for each article to indicate sub-articles and images.
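As a rough illustration of what I have in mind, here is a minimal sketch of one article entry in such an export, loosely modeled on MediaWiki's standard export format. The `<subarticles>` and `<images>` elements and their attributes are hypothetical names chosen for this sketch, not part of any existing DTD:

```xml
<page>
  <title>Example Article</title>
  <revision>
    <text xml:space="preserve">Article wikitext goes here...</text>
  </revision>
  <!-- Hypothetical extension elements added by the export tool -->
  <subarticles>
    <subarticle title="Example Article/Subtopic" />
  </subarticles>
  <images>
    <image name="Example.jpg" exported="true" />
  </images>
</page>
```

The exact element and attribute names would of course be settled once we agree on the export schema.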
Interwiki links can be manipulated if there's a need to retarget imported articles to links within other imported pages.
I can go over a more detailed list of the pages that will be required at the UI level, as well as your thoughts on how exports, and regularly scheduled re-exports, will be done for ongoing synchronization.
Please pardon me if my assumptions are off-base; I'm trying to be as thorough as possible in a limited response. I look forward to working with you.
Thanks.
Bill