Hi, I intend to create a wiki using Wiki::Toolkit. I am having problems writing nodes with versions: I can create a node, but I am not able to update its content. Can anyone here advise me, please? Also, I am unsure of some wiki concepts; if I am not wrong: 1. node = a unique HTML page dedicated to a title
Any help will be much appreciated.
On Tue 25 Dec 2007, abhishek jain abhishek.netjain@gmail.com wrote:
I do intend to create a wiki using Wiki::Toolkit .
Excellent!
I am having problems writing nodes with versions: I can create a node, but I am not able to update its content. Can anyone here advise me, please?
Can you show us the code you're using, please?
Also, I am unsure of some wiki concepts; if I am not wrong:
- node = a unique html page dedicated to a title
Yes, that's right.
Kake
Thanks for the quick response.
On 12/25/07, Kake L Pugh kake@earth.li wrote:
Can you show us the code you're using, please?
I am using the code given in the synopsis, calling the function directly: my $written = $wiki->write_node( 'Home', 'abhishek struggles for this wiki.', undef, {}, 1 );
I was also wondering what the moderation bit means, i.e. what the concept of moderation is here. Also, what is a checksum, and how does it relate to the wiki here?
I am interested in the project you are all doing. Is there a way I can also contribute, for instance by writing some code? -- abhi
On Tue 25 Dec 2007, abhishek jain abhishek.netjain@gmail.com wrote:
I am using the code given in the synopsis, calling the function directly: my $written = $wiki->write_node( 'Home', 'abhishek struggles for this wiki.', undef, {}, 1 );
It really would be easier to figure out what's going on if you could send the actual code you're using. When I asked if you could show us your code, I meant actually show us the code, as in attach the script that's giving you problems. It's easier to show it than to describe it.
Having said that, my guess is that your problem is caused by not supplying the checksum, since your third argument is undef. You should only have undef there if it's the first time you've written the node. The point of the checksum is to make certain that you're not overwriting someone else's changes to the wiki page.
If you check the return value of ->write_node, this should show up - if the write fails, the return value will be false, whereas if it succeeds then the return value should be true.
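As a minimal sketch of that check (assuming a $wiki object already set up as in the Wiki::Toolkit synopsis, and a node called "Home" that already exists):

```perl
# Assumes $wiki is a Wiki::Toolkit object, configured as in the synopsis.
# Fetch the current version of the node so we have its checksum.
my %node = $wiki->retrieve_node( "Home" );

my $written = $wiki->write_node(
    "Home",
    "Updated content.",
    $node{checksum},    # pass the current checksum, not undef
);

if ( $written ) {
    print "Node saved.\n";
}
else {
    print "Write failed - someone else probably saved an edit first.\n";
}
```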
If we didn't use a checksum, this would be possible:
Person A loads up a wiki page and decides to change it, so clicks on "edit this page" and starts making their changes. Person B loads up the same wiki page and decides to change it, so clicks on "edit this page" and starts making their changes. Person A finishes their edits and clicks "save". Person B finishes their edits and clicks "save".
The problem here is that Person B has never seen Person A's edits, but has saved their own edits on top of it - so Person A's edits are lost.
The way the checksum works is that when you ask $wiki for the content of the page in order to display it in the edit form, it also gives you a checksum which is unique to that version of that page. You need to put this checksum in the edit form as a hidden field, so when the user comes to save the wiki page content, it's possible to make sure that the version the user was editing is the same as the current version in the database.
So you want something like this:
my %pagedata = $wiki->retrieve_node( "My Page" );
$wiki->write_node( "My Page", "New stuff on my page", $pagedata{checksum} );
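In a CGI script, the full round trip might look roughly like this (a sketch only: the form field names are invented, and $wiki is assumed to be set up already):

```perl
use CGI;
my $q = CGI->new;

# Displaying the edit form: fetch the content plus its checksum,
# and put the checksum in a hidden field.
my %node = $wiki->retrieve_node( "My Page" );
print $q->start_form,
      $q->textarea( -name => 'content', -default => $node{content} ),
      $q->hidden( -name => 'checksum', -value => $node{checksum} ),
      $q->submit( 'Save' ),
      $q->end_form;

# Handling the save: pass the checksum from the form back to write_node,
# so it can detect whether someone else saved an edit in the meantime.
my $ok = $wiki->write_node( "My Page",
                            scalar $q->param('content'),
                            scalar $q->param('checksum') );
print "Edit conflict - please merge your changes and try again.\n"
    unless $ok;
```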
I was also wondering what the moderation bit means, i.e. what the concept of moderation is here.
I think it was Nick Burch who wrote the moderation stuff, so I don't entirely understand it, but the idea is that if a node has its moderation bit set, then any changes committed to the page will only show up once an administrator has approved them. It does have some bugs though.
Also, what is a checksum, and how does it relate to the wiki here?
Explained above.
I am interested in the project you are all doing. Is there a way I can also contribute, for instance by writing some code?
There is! Our release manager is Dominic Hargreaves, and he can tell you all about that, but I think he's one of these people who celebrate Christmas so probably won't be checking email today :)
Kake
On Wed, Dec 26, 2007 at 12:05:23AM +0000, Kake L Pugh wrote:
On Tue 25 Dec 2007, abhishek jain abhishek.netjain@gmail.com wrote:
I am interested in the project you are all doing. Is there a way I can also contribute, for instance by writing some code?
There is! Our release manager is Dominic Hargreaves, and he can tell you all about that, but I think he's one of these people who celebrate Christmas so probably won't be checking email today :)
Hi abhishek,
As Kake says, I'm currently responsible for new releases of Wiki::Toolkit. There's not been much development recently, but if you have any ideas for improvements (or, better, patches!) then please do email them to the mailing list or (better!) submit them to our ticketing system at
(you need to register for an account first).
Cheers,
Dominic.
Hi people, I have created a wiki based on Embperl and Wiki::Toolkit; visit it if you wish at http://delhi.pm.org/ .

I was actually planning to create multiple wikis on the same database, served from a single installation. Is there a method by which I can send an extra parameter (the domain) with every request, something like WordPress, which has one script, a single installation, and multiple domains?

Also, do you have documentation on the MySQL tables which Wiki::Toolkit creates? I mean the fields, the relations, and some comments.

Further, has your software been tested on large data, i.e. at least hundreds if not more pages of content?

You are doing excellent work, as I created a simple wiki in days.
On 1/3/08, Dominic Hargreaves dom@earth.li wrote:
On Wed, Dec 26, 2007 at 12:05:23AM +0000, Kake L Pugh wrote: As Kake says, I'm currently responsible for new releases of Wiki::Toolkit. There's not been much development recently, but if you have any ideas for improvements (or, better, patches!) then please do email them to the mailing list or (better!) submit them to our ticketing system at
Regarding the patches, I am still exploring and will update you as and when I find something. If I can be of any use to you all, I will be happy to help; I am a CPAN author and a Perl Mongers group leader with 4+ years of experience in Perl.
(you need to register for an account first).
How do I register for an account?
Thanks and kind Regards, Abhishek jain
On Sat, 5 Jan 2008, abhishek jain wrote:
I was actually planning to create multiple wikis on the same database, served from a single installation. Is there a method by which I can send an extra parameter (the domain) with every request, something like WordPress, which has one script, a single installation, and multiple domains?
I'm sure we discussed this a little while ago, but I can't seem to find it in the archives. Does anyone know where that thread is? I'm pretty sure it was with one of the guys from goats.com.
also do you have documentation on the MySQL tables which Wiki::Toolkit creates? I mean the fields, the relations, and some comments
See Wiki::Toolkit::Store::MySQL
Further, has your software been tested on large data, i.e. at least hundreds if not more pages of content?
Wiki::Toolkit powers the OpenGuides sites ( http://openguides.org/ ), many of which have very large numbers of pages.
Nick
On Sat 05 Jan 2008, abhishek jain abhishek.netjain@gmail.com wrote:
I have created a wiki based on Embperl and Wiki::Toolkit; visit it if you wish at http://delhi.pm.org/ .
Shiny. I normally hate light-on-dark colour schemes, but yours is actually OK. The text is a bit small though. Also, would it be possible to make it so once I hide the announcements, they stay hidden when I navigate to different pages?
I was actually planning to create multiple wikis on the same database, served from a single installation. Is there a method by which I can send an extra parameter (the domain) with every request [...]
Jody Belka was working on some changes along these lines - there was a thread back in November 2003 called "Table-prefix support". He was going to get some patches sorted out but decided to wait until I'd got the test suite in better shape first so he could write some tests for it. I think what happened was that I got the tests sorted out but never prodded him to do his part.
also do you have documentation on the MySQL tables which Wiki::Toolkit creates? I mean the fields, the relations, and some comments
Have a look at the source of Wiki::Toolkit::Setup::MySQL - the SQL statements for table creation are all there at the top.
(Hm, I note that the MySQL setup is still missing the index on the node table.)
Further, has your software been tested on large data, i.e. at least hundreds if not more pages of content?
As Nick says, OpenGuides uses Wiki::Toolkit, and some of the Open Guides have thousands of pages. The Randomness Guide to London:
http://london.randomness.org.uk/
has 2695 pages at the moment, and no performance problems.
The Boston Guide has (as far as I know) tens of thousands of pages, and had some performance issues which they sorted out by using memcached. I don't know if those issues were mostly from Wiki::Toolkit, or from one of the many other CPAN modules that OpenGuides uses, or from both.
(you need to register for an account first).
How do I register for an account?
Just follow the instructions on that page - i.e. go to http://dev.openguides.org/register and fill in the form there.
Kake
On 1/5/08, Kake L Pugh kake@earth.li wrote:
I was actually planning to create multiple wikis on the same database, served from a single installation. Is there a method by which I can send an extra parameter (the domain) with every request [...]
Jody Belka was working on some changes along these lines - there was a thread back in November 2003 called "Table-prefix support". He was going to get some patches sorted out but decided to wait until I'd got the test suite in better shape first so he could write some tests for it. I think what happened was that I got the tests sorted out but never prodded him to do his part.
OK, so I need to get this multiple-domains-on-one-installation thing done and will start development. But if I start, will it be compatible with later/future developments of Wiki::Toolkit? Also, at first sight, introducing a column called domainid into most if not all of the MySQL tables looks to be a solution, and then passing a 'domainid' parameter while initialising the $wiki object. What do you say? Thanks
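Roughly what I have in mind (a purely hypothetical sketch: neither the domainid column nor the 'domainid' parameter exists in Wiki::Toolkit today):

```perl
# HYPOTHETICAL sketch of the proposed domainid approach - neither the
# column nor the constructor parameter exists in Wiki::Toolkit yet.
use DBI;

# Connection details are made up for the example.
my $dbh = DBI->connect( "dbi:mysql:wikidb", "wikiuser", "secret",
                        { RaiseError => 1 } );

# Add a domain column to (for example) the node table.
$dbh->do( "ALTER TABLE node ADD COLUMN domainid INT NOT NULL DEFAULT 0" );

# The constructor would then take the extra parameter, and the store
# would add "AND domainid = ?" to its queries behind the scenes:
my $wiki = Wiki::Toolkit->new(
    store    => $store,    # a Wiki::Toolkit::Store::MySQL object
    domainid => 1,         # proposed parameter, not yet real
);
```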
On Mon 07 Jan 2008, abhishek jain abhishek.netjain@gmail.com wrote:
OK, so I need to get this multiple-domains-on-one-installation thing done and will start development. But if I start, will it be compatible with later/future developments of Wiki::Toolkit? Also, at first sight, introducing a column called domainid into most if not all of the MySQL tables looks to be a solution, and then passing a 'domainid' parameter while initialising the $wiki object. What do you say?
Whatever the solution we choose, it needs to be backwards compatible. And of course any patches you make will need to have tests before we can accept them. I think I prefer Jody's solution of having table prefixes, since it means people with existing installations won't have to munge their databases.
What does everyone else think?
Kake
On Tue, Jan 08, 2008 at 06:55:29PM +0000, Kake L Pugh wrote:
On Mon 07 Jan 2008, abhishek jain abhishek.netjain@gmail.com wrote:
OK, so I need to get this multiple-domains-on-one-installation thing done and will start development. But if I start, will it be compatible with later/future developments of Wiki::Toolkit? Also, at first sight, introducing a column called domainid into most if not all of the MySQL tables looks to be a solution, and then passing a 'domainid' parameter while initialising the $wiki object. What do you say?
Whatever the solution we choose, it needs to be backwards compatible. And of course any patches you make will need to have tests before we can accept them. I think I prefer Jody's solution of having table prefixes, since it means people with existing installations won't have to munge their databases.
It's true that it'll need more munging, but since we have a neat way to upgrade between database schema versions, I don't see a huge problem with adding a new column to the database if it is otherwise the best solution.
Are there occasions where we would want to fetch all nodes from the database in one go? In that case, having multiple tables, one per 'domain', would just be inconvenient.
Are there any other similar considerations? Differing ACLs maybe? The column method certainly feels cleaner.
Dominic.
I'm the "goats guy" who asked about this back in the day. Although even then I had left Goats to do this webcomics hosting thing.
I ended up just using the formatting modules, and rewriting the portions of the other code I needed for myself, and then I (of course) left it all half finished, because I ended up not needing an in-house wiki for my clients. Having about a half million other things on my to-do list, this got left behind. However, I still think it'd be a nice "feature" for them, so if it gets solved, then I'm extra happy.
There are two different problems being solved here, by varying degrees, by two different solutions.
Problem 1 is integrating the wiki tables into an existing database, and not having namespace collision with existing tables.
Problem 2 is having multiple wikis across one or several sites.
The two solutions being talked about are table name prefixing, and adding a domainid column.
The name prefixing is primarily a solution to the namespace collision problem, but can also be used as a (messy) solution to multiple wikis in one database (different prefixes per wiki). It just means lots of duplicate tables in one database, which not only gives my inner DBA the twitches, but also means that if the code is upgraded, then lots of copies of "the same" table need to be altered to work with the upgraded code.
The domainid is a solution only to problem 2, and makes my inner-dba much happier.
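To make the contrast concrete (a hypothetical sketch: neither scheme is implemented in Wiki::Toolkit, and all table and column names here are invented for illustration):

```perl
# HYPOTHETICAL - this just contrasts the two proposals.
use DBI;

my $dbh = DBI->connect( "dbi:mysql:mydb", "user", "pass",
                        { RaiseError => 1 } );

# Option 1: table-name prefixing. Each wiki gets a full copy of the
# schema, so a code upgrade must ALTER every prefixed copy:
for my $prefix ( qw( clientA_ clientB_ ) ) {
    $dbh->do( "ALTER TABLE ${prefix}node ADD COLUMN some_new_field TEXT" );
}

# Option 2: a shared domainid column. One set of tables; every query
# simply filters on the domain, and an upgrade alters each table once:
my $sth = $dbh->prepare(
    "SELECT text FROM node WHERE name = ? AND domainid = ?"
);
$sth->execute( "Home", 42 );
```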
Personally, I would like _both_ of these modifications made to the code, as then it becomes a perfect fit for me. (Multiple clients sharing one set of tables that don't conflict with anything else in my database.)
thanks, -Phillip
On Mon 07 Jan 2008, abhishek jain abhishek.netjain@gmail.com wrote:
OK, so I need to get this multiple-domains-on-one-installation thing done and will start development. But if I start, will it be compatible with later/future developments of Wiki::Toolkit? Also, at first sight, introducing a column called domainid into most if not all of the MySQL tables looks to be a solution, and then passing a 'domainid' parameter while initialising the $wiki object. What do you say?
Whatever the solution we choose, it needs to be backwards compatible. And of course any patches you make will need to have tests before we can accept them. I think I prefer Jody's solution of having table prefixes, since it means people with existing installations won't have to munge their databases.
What does everyone else think?
Kake
------------------------------------ - Phillip Karlsson - - http://www.dumbrellahosting.com/ - ------------------------------------
On Tue 08 Jan 2008, Phillip Karlsson phillip@dumbrellahosting.com wrote:
Problem 2 is having multiple wikis across one or several sites. [...] The domainid is a solution only to problem 2, and makes my inner DBA much happier.
Yes, that makes sense. I guess at some point we might also want to think about removing the textual content from the node table too - this would make it possible to share nodes between wikis[0].
[0] I already have a couple of use cases for this: (a) sharing content between the London Crafts wiki and the Randomness Guide to London (specifically, details of London craft shops) and (b) sharing content between the Randomness Guide to London and a more local guide to a specific borough (RGL may not want some of the very specific local content). But it is a far-away idea and not something I'm thinking of working towards any time soon.
Personally, I would like _both_ of these modifications made to the code, as then it becomes a perfect fit for me. (Multiple clients sharing one set of tables that don't conflict with anything else in my database.)
Yes, that sounds ideal!
So, given Abhishek's pressing need for multiple wikis in one database, can we say it makes sense for him to go ahead with the domainid idea, and we're happy to make sure it gets integrated into the CPAN code?
Kake
Hi all,
So, given Abhishek's pressing need for multiple wikis in one database, can we say it makes sense for him to go ahead with the domainid idea, and we're happy to make sure it gets integrated into the CPAN code?
Yes, it does make sense; I will be implementing the domainid part.
--
Thanks and kind Regards, Abhishek jain