# WikipediaFS

• Developer(s): Mathieu Blondel
• Initial release: 11 June 2006
• Operating systems: Mac OS X, Linux, FreeBSD
• Website: http://wikipediafs.sourceforge.net

WikipediaFS is a mountable Linux virtual file system that allows users to read and edit articles on Wikipedia (or any other MediaWiki-based site) as if they were real files.

It is thus possible to view and edit articles using your favourite text editor. Text editors tend to be more convenient than a simple browser form when it comes to editing large texts, and they generally include useful features such as MediaWiki syntax highlighting and spell checking.

Work on the project has been suspended since 2007.

## Installation and first start

Command to install the package:

```
# apt-get install wikipediafs
```


The following packages will be installed automatically along with it:

• libfuse2
• fuse-utils
• python-fuse

After installation you can attempt the first start:

```
$ mount.wikipediafs
fuse: missing mountpoint
filesystem initialization failed
```

Despite the error, the program has created a configuration file. We now need to edit it and specify the connection settings for the MediaWiki site:

```
igor@chub:~$ ls /home/igor/.wikipediafs/
config.xml  wikipediafs.log
```


For example, a configuration file for working with ru.bmstu.wiki will look like this:

```
<?xml version="1.0" encoding="UTF-8"?>
<wfs-config>
  <general>
    <article-cache-time>30</article-cache-time>
  </general>
  <sites>
    <site>
      <dirname>ru.bmstu.wiki</dirname>
      <host>ru.bmstu.wiki</host>
      <basename>/w/index.php</basename>
    </site>
  </sites>
</wfs-config>
```
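For sites that require a logged-in account to edit, the `<site>` block can also carry credentials. The `<username>` and `<password>` element names below follow the project's sample configuration, but treat them as an assumption to verify against the sample config shipped with your version:

```xml
<site>
  <dirname>ru.bmstu.wiki</dirname>
  <host>ru.bmstu.wiki</host>
  <basename>/w/index.php</basename>
  <!-- placeholder credentials; replace with a real account -->
  <username>WikiUser</username>
  <password>secret</password>
</site>
```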


## Example of use

To use WikipediaFS as a regular user, you need to add the user to the fuse group and log in again:

```
$ sudo usermod -a -G fuse user
```

After that you will be able to use the wiki as a file system:

```
$ mount.wikipediafs /mnt/wfs/
$ ls /mnt/wfs/
ru.bmstu.wiki
$ cd /mnt/wfs/ru.bmstu.wiki
$ ls
```



The directory is empty because we have not accessed a single article yet.

Run:

```
$ cat WikipediaFS.mw | less
```

Now you can see the wikitext of this page. Try to change it:

```
$ vi WikipediaFS.mw
```


When you save the file, the page on the wiki is updated accordingly.
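One caveat for scripted edits: tools that save by writing a new file and renaming it over the old one (as `sed -i` and `perl -i` do) may not play well with a FUSE mount like this, while writing back through a shell redirect does. A minimal sketch of that pattern, shown on an ordinary local file with a hypothetical path:

```shell
# In-place-style edit without rename(2): capture the transformed text,
# then write it back to the same file through a redirect.
f=/tmp/Article.mw
printf 'Hello wiki\n' > "$f"
content=$(sed 's/wiki/world/' "$f")
printf '%s\n' "$content" > "$f"
cat "$f"
# prints: Hello world
```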

## Automation of work

If the wiki project is not very large, you can pull the names of all of its pages with the help of a small script:

```
SITE=ru.bmstu.wiki
INDEX=/w/index.php
wget -O - 'http://'$SITE$INDEX'?title=Special:Popularpages&limit=1000&offset=0' 2>/dev/null \
| grep 'li..a.href=..wiki' \
| perl -p -e 's@.*?/wiki/@@; s@".*@@' \
| perl -p -e 's/\%([A-Fa-f0-9]{2})/pack("C", hex($1))/seg;'
```

If this script is saved as wpfs-all-pages, one way to make the first 100 pages appear in the directory is:

```
wpfs-all-pages \
| head -100 \
| grep -v / \
| while read name
do
    head -1 /mnt/wfs/ru.bmstu.wiki/$name.mw > /dev/null
    echo $name
done
```

A small wrapper script, wpfs-perl-i, emulates perl -i for files on the mount (plain perl -i replaces the file instead of writing through it, so it is avoided here):

```
#!/bin/sh
set -x
filename=$1
cat $filename > /tmp/wpfs-perl-i-$$
shift
cat /tmp/wpfs-perl-i-$$ | perl "$@" > $filename
rm /tmp/wpfs-perl-i-$$
```

For example, to replace short links with a template on every cached page:

```
for i in *.mw ; do
    wpfs-perl-i $i -p -e "s@Short link: http://ru.bmstu.wiki/wiki/([^'< \n]*)@{{short|\\$1}}@"
done
```