From: Simon Cathebras <simon.cathebras@ensimag.imag.fr>
To: git@vger.kernel.org
Cc: Matthieu.Moy@imag.fr, simon.cathebras@ensimag.imag.fr,
	charles.roussel@ensimag.imag.fr, Guillaume.Sasdy@ensimag.imag.fr,
	Julien.Khayat@ensimag.imag.fr, Simon.Perrat@ensimag.imag.fr,
	peff@peff.net, gitster@pobox.com,
	Guillaume Sasdy <guillaume.sasdy@ensimag.imag.fr>,
	Julien Khayat <julien.khayat@ensimag.imag.fr>,
	Simon Perrat <simon.perrat@ensimag.imag.fr>,
	Matthieu Moy <matthieu.moy@imag.fr>
Subject: [PATCH 2/3] Test environment for git-remote-mediawiki
Date: Tue,  5 Jun 2012 15:25:55 +0200
Message-ID: <1338902756-4162-2-git-send-email-simon.cathebras@ensimag.imag.fr>
In-Reply-To: <1338902756-4162-1-git-send-email-simon.cathebras@ensimag.imag.fr>

From: Guillaume Sasdy <guillaume.sasdy@ensimag.imag.fr>

In order to test git-remote-mediawiki, we need a package of functions
to manage a MediaWiki: edit a page, remove a page, fetch a page,
fetch all pages on a given wiki.

We also need functions to compare the contents of directories.
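
For illustration, here is a hypothetical sketch of how a later test script
might use these helpers (not part of this patch; test_expect_success comes
from t/test-lib.sh, and $WIKI_URL and a working git-remote-mediawiki are
assumed to be provided by the rest of the series):

	# Create a page on the wiki, clone the wiki through
	# git-remote-mediawiki, then compare the clone with the pages
	# fetched directly from the wiki.
	test_expect_success 'git clone matches the wiki content' '
		wiki_editpage Foo "this page must be cloned" false &&
		git clone mediawiki::"$WIKI_URL" mw_dir &&
		wiki_getallpage ref_page &&
		git_diff_directories mw_dir ref_page
	'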

Signed-off-by: Simon Cathebras <simon.cathebras@ensimag.imag.fr>
Signed-off-by: Guillaume Sasdy <guillaume.sasdy@ensimag.imag.fr>
Signed-off-by: Julien Khayat <julien.khayat@ensimag.imag.fr>
Signed-off-by: Simon Perrat <simon.perrat@ensimag.imag.fr>
Signed-off-by: Charles Roussel <charles.roussel@ensimag.imag.fr>
Signed-off-by: Matthieu Moy <matthieu.moy@imag.fr>
---
 t/test-gitmw-lib.sh | 105 +++++++++++++++++++++++++++++++
 t/test-gitmw.pl     | 176 ++++++++++++++++++++++++++++++++++++++++++++++++++++
 2 files changed, 281 insertions(+)
 create mode 100755 t/test-gitmw.pl

diff --git a/t/test-gitmw-lib.sh b/t/test-gitmw-lib.sh
index cebef24..cb5ba19 100755
--- a/t/test-gitmw-lib.sh
+++ b/t/test-gitmw-lib.sh
@@ -26,6 +26,111 @@ DB_INSTALL_SCRIPT="db_install.php"
 WIKI_ADMIN="WikiAdmin"
 WIKI_PASSW="AdminPass"
 
+# wiki_getpage <wiki_page> <dest_dir>: fetch <wiki_page> from the wiki into <dest_dir>
+wiki_getpage () {
+	"$GIT_BUILD_DIR"/t/test-gitmw.pl get_page "$@"
+}
+
+# wiki_delete_page <page_name>: delete <page_name> from the wiki
+wiki_delete_page () {
+	"$GIT_BUILD_DIR"/t/test-gitmw.pl delete_page "$@"
+}
+
+# wiki_editpage <wiki_page> <content> <append> [<category>]: create or edit <wiki_page>
+wiki_editpage () {
+	"$GIT_BUILD_DIR"/t/test-gitmw.pl edit_page "$@"
+}
+
+die () {
+	die_with_status 1 "$@"
+}
+
+die_with_status () {
+	status=$1
+	shift
+	echo >&2 "$*"
+	exit "$status"
+}
+
+# git_diff_directories <dir_git> <dir_wiki>
+#
+# Compares the contents of directories <dir_git> and <dir_wiki> with diff
+# and dies if they do not match. Only the *.mw files of <dir_git> are
+# compared, so its .git directory is ignored.
+# Warning: the first argument MUST be the directory containing the git data
+git_diff_directories () {
+	mkdir -p "$1_tmp"
+	cp "$1"/*.mw "$1_tmp"
+
+	diff -r -b "$1_tmp" "$2"
+
+	if test $? -ne 0
+	then
+		rm -rf "$1_tmp"
+		die "test failed: directories $1 and $2 do not match"
+	fi
+	rm -rf "$1_tmp"
+}
+
+
+# wiki_check_content <file_name> <page_name>
+#
+# Compares the contents of the file <file_name> and the wiki page
+# <page_name> and exits with error 1 if they do not match.
+wiki_check_content () {
+	mkdir -p wiki_tmp
+	wiki_getpage "$2" wiki_tmp
+	diff -b "$1" wiki_tmp/"$2".mw
+	if test $? -ne 0
+	then
+		rm -rf wiki_tmp
+		die "test failed: file $1 does not match wiki page $2"
+	fi
+	rm -rf wiki_tmp
+}
+
+# wiki_page_exist <page_name>
+#
+# Checks that the page <page_name> exists on the wiki and exits
+# with an error if it does not.
+wiki_page_exist () {
+	wiki_getpage "$1" .
+
+	if test -f "$1".mw ; then
+		rm "$1".mw
+	else
+		die "test failed: file $1 not found on wiki"
+	fi
+}
+
+# wiki_getallpagename
+#
+# Fetch the name of each page of the wiki and write the list to all.txt.
+wiki_getallpagename () {
+	"$GIT_BUILD_DIR"/t/test-gitmw.pl getallpagename
+}
+
+# wiki_getallpagecategory <category>
+#
+# Same as wiki_getallpagename, but restricted to the pages of <category>.
+wiki_getallpagecategory () {
+	"$GIT_BUILD_DIR"/t/test-gitmw.pl getallpagename "$@"
+}
+
+# wiki_getallpage <dest_dir> [<category>]
+#
+# Fetch all the pages of the wiki and place them in the directory
+# <dest_dir>.
+# If <category> is defined, only the pages belonging to <category>
+# are fetched.
+wiki_getallpage () {
+	if test -z "$2"
+	then
+		wiki_getallpagename
+	else
+		wiki_getallpagecategory "$2"
+	fi
+	mkdir -p "$1"
+	while read -r line; do
+		wiki_getpage "$line" "$1"
+	done < all.txt
 }
 
 # Create the SQLite database of the MediaWiki. If the database file already
diff --git a/t/test-gitmw.pl b/t/test-gitmw.pl
new file mode 100755
index 0000000..ff7f63e
--- /dev/null
+++ b/t/test-gitmw.pl
@@ -0,0 +1,176 @@
+#!/usr/bin/perl -w
+# Copyright (C) 2012
+#     Charles Roussel <charles.roussel@ensimag.imag.fr>
+#     Simon Cathebras <simon.cathebras@ensimag.imag.fr>
+#     Julien Khayat <julien.khayat@ensimag.imag.fr>
+#     Guillaume Sasdy <guillaume.sasdy@ensimag.imag.fr>
+#     Simon Perrat <simon.perrat@ensimag.imag.fr>
+# License: GPL v2 or later
+
+# Usage:
+#       ./test-gitmw.pl <command> [argument]*
+# Run it from the command line with the name of the function to call as
+# the first argument and that function's arguments as the remaining ones.
+#
+# Example:
+#     ./test-gitmw.pl "get_page" foo .
+# will call <wiki_getpage> with arguments <foo> and <.>
+#
+# Available functions are:
+#     "get_page"
+#     "delete_page"
+#     "edit_page"
+#     "getallpagename"
+
+use MediaWiki::API;
+use Switch;
+# URL of the wiki used for the tests
+my $wiki_url="http://localhost/wiki/api.php";
+my $wiki_admin='WikiAdmin';
+my $wiki_admin_pass='AdminPass';
+my $mw = MediaWiki::API->new;
+$mw->{config}->{api_url} = $wiki_url;
+
+# wiki_login <name> <password>
+#
+# Logs in to the wiki referenced by the global variable $mw with
+# user <name> and password <password>.
+sub wiki_login {
+	$mw->login( { lgname => "$_[0]", lgpassword => "$_[1]" } )
+		|| die "wiki_login: login failed";
+}
+
+# wiki_getpage <wiki_page> <dest_path>
+#
+# Fetches the page <wiki_page> from the wiki referenced by the global
+# variable $mw and writes its content into the directory <dest_path>.
+sub wiki_getpage {
+	my $pagename = $_[0];
+	my $destdir = $_[1];
+
+	my $page = $mw->get_page( { title => $pagename } );
+	if (!defined($page)) {
+		die "getpage: wiki does not exist";
+	}
+
+	my $content = $page->{'*'};
+	if (!defined($content)) {
+		die "getpage: page does not exist";
+	}
+
+	# Replace spaces by underscores in the page name
+	$pagename = $page->{'title'};
+	$pagename =~ s/ /_/g;
+	open(my $file, '>', "$destdir/$pagename.mw") || die "getpage: $!";
+	print $file "$content";
+	close ($file);
+}
+
+# wiki_delete_page <page_name>
+#
+# Deletes the page named <page_name> from the wiki referenced by
+# the global variable $mw.
+sub wiki_delete_page {
+	my $pagename = $_[0];
+
+	my $exist=$mw->get_page({title => $pagename});
+
+	if (defined($exist->{'*'})){
+		$mw->edit({ action => 'delete',
+			        title => $pagename})
+		|| die $mw->{error}->{code} . ": " . $mw->{error}->{details};
+	} else {
+		die "no page with such name found: $pagename\n";
+	}
+}
+
+# wiki_editpage <wiki_page> <wiki_content> <wiki_append> [<category>]
+#
+# Edits the page named <wiki_page> on the wiki referenced by the global
+# variable $mw, setting its content to <wiki_content>.
+# If <wiki_append> is "true", <wiki_content> is appended to the current
+# content of the page instead.
+# If <wiki_page> does not exist, it is created with content <wiki_content>.
+# If <category> is given, the page is also added to that category.
+sub wiki_editpage {
+	my $wiki_page = $_[0];
+	my $wiki_content = $_[1];
+	my $wiki_append = $_[2];
+
+
+	my $append = 0;
+	if (defined($wiki_append) && $wiki_append eq 'true') {
+		$append=1;
+	}
+
+	my $previous_text ="";
+
+	if ($append) {
+		my $ref = $mw->get_page( { title => $wiki_page } );
+		$previous_text = $ref->{'*'};
+	}
+
+	my $text = $wiki_content;
+	if (defined($previous_text)) {
+		$text="$previous_text$text";
+	}
+
+	# Optionally, add this page to a category.
+	if (defined($_[3])) {
+		my $category_name="[[Category:$_[3]]]";
+		$text="$text\n $category_name";
+	}
+	$mw->edit( { action => 'edit', title => $wiki_page, text => "$text"} );
+}
+
+
+# wiki_getallpagename [<category>]
+#
+# Fetches the name of every page of the wiki referenced by the global
+# variable $mw and writes the names to the file all.txt, one name per
+# line ("\n" separated).
+# If the argument <category> is defined, only the pages belonging to
+# <category> are listed.
+sub wiki_getallpagename {
+	# fetch the pages of the wiki
+	if (defined($_[0])) {
+		$mw->list ( { action => 'query',
+				list => 'categorymembers',
+				cmtitle => "Category:$_[0]",
+				cmnamespace => 0,
+				cmlimit=> 500 },
+			{ max => 4, hook => \&cat_names } )
+			|| die $mw->{error}->{code}.": ".$mw->{error}->{details};
+	} else {
+		$mw->list ( { action => 'query',
+				list => 'allpages',
+				apnamespace => 0,
+				aplimit => 500 },
+			{ max => 4, hook => \&cat_names } )
+			|| die $mw->{error}->{code}.": ".$mw->{error}->{details};
+	}
+	# Hook called by $mw->list above: print the name of each page to all.txt
+	sub cat_names {
+		my ($ref) = @_;
+
+		open(my $file, '>', 'all.txt') || die "Cannot open all.txt: $!";
+		foreach (@$ref) {
+			print $file "$_->{title}\n";
+		}
+		close ($file);
+	}
+}
+
+# Main part of this script: parse the command line arguments
+# and select which function to execute
+my $fct_to_call = shift;
+
+&wiki_login($wiki_admin, $wiki_admin_pass);
+
+switch ($fct_to_call) {
+	case "get_page" { &wiki_getpage(@ARGV)}
+	case "delete_page" { &wiki_delete_page(@ARGV)}
+	case "edit_page" { &wiki_editpage(@ARGV)}
+	case "getallpagename" { &wiki_getallpagename(@ARGV)}
+	else { die("test-gitmw.pl ERROR: unknown command: $fct_to_call")}
+}
+
-- 
1.7.10.2.552.gaa3bb87
