Editing the pre-existing articles would've been absurd, especially with many more hypothetical releases in the next few years.
No, it would have been the sensible way to do this. After making the clone at 40d, of course. Or do you want a reset every 'major change' release? Because let's face it, the majority of the articles on there aren't likely to need much change.
Under your scrap everything logic we may as well wait for the next version before making new pages so we don't have to scrap all the hard work again...
However it would have been nice if the search defaulted to 40d info pages if the 2010 ones don't have content - which I believe would have cut down a lot of the complaints as most of them are along the lines of "I searched for X and all I got was a placeholder page".
No idea what manner of sorcery would be involved in that though.
The cheapest way would probably have been to put a redirect to the 40d version on the newly generated blank pages, though that would have made it hard for editors. At this point I'd just suggest cutting and pasting old data that is (at least as far as you know) correct when you come across empty pages.
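For what it's worth, the redirect being suggested is a one-line page in MediaWiki markup, so a bot seeding every blank DF2010 page could have done it cheaply. A sketch of what one such seeded page would contain ("Iron" is just an example article name, not a claim about any specific page):

```wikitext
#REDIRECT [[40d:Iron]]
```

The catch, as noted above, is that an editor wanting to write the DF2010 version then has to know to follow the `&redirect=no` link to get at the page source instead of being bounced to the old article.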
You can get to the old 40d stuff from a link on the df2010 page.
However it would have been nice if the search defaulted to 40d info pages if the 2010 ones don't have content - which I believe would have cut down a lot of the complaints as most of them are along the lines of "I searched for X and all I got was a placeholder page".
No idea what manner of sorcery would be involved in that though.
1) A slow edit-by-edit change into "this is now up to date" means we don't have easily accessible and updatable information about the previous version, which some will likely play for the significant future.
Irrelevant; you can create a backup which would have matched the current 40d. In fact, the namespacing was a sensible way to do it.
We DID create a backup. It's at 40d:Article.
2) It would be impossible to look at an article that is several pages of DF2010 and 40d information mixed together and at some point declare "this is now up to date" without going through each item yourself and specifically confirming it was still accurate. This will lead to one of two scenarios: A) it will be a long time before we have pages that we can rely on, B) eventually people will just start from scratch.
2a) It's a wiki; it will never be fully accurate, but working from existing data will result in faster page changes. Furthermore, a lot (most?) of the old information is still valid as is.
2b) I don't see how this is a downside at all, but under the current system you forced it.
I wish I had noticed the discussion going on before these changes were made, as it's fairly easy to point out how flawed the final choice is. In any case, I'm sure it won't be too long until some of the pages have data again.
We CAN work from existing data, in fact on most pages that data is at a link prominently displayed right at the top. Because most but not all information is up to date you can use the old information to add to the new articles after you've checked it.
We forced it because if it was gradual there'd be no way to know whether every part of an article had been verified for the new version unless you did it yourself. This way, content can be added to the new article piece by piece, and we can know that DF2010: stuff IS accurate because someone has checked it.
I wish you had known too, but then again these statements were addressed then as well.
However it would have been nice if the search defaulted to 40d info pages if the 2010 ones don't have content - which I believe would have cut down a lot of the complaints as most of them are along the lines of "I searched for X and all I got was a placeholder page".
No idea what manner of sorcery would be involved in that though.
Sure, that would be neat, but it's not an editor-level change; something like that, if it's possible at all, would have to be hacked into an already unreliable wiki system. But then I think it's likely those placeholder pages wouldn't have been filled in as quickly as they are.
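The fallback logic itself is simple, whatever the plumbing would look like; the hard part is wiring it into the wiki's search. A minimal sketch of the decision, assuming pages that are empty or contain only a stub template count as placeholders (the `{{stub}}`/`{{placeholder}}` marker names here are assumptions, not the wiki's actual templates):

```python
# Sketch of the proposed search fallback: prefer the DF2010 page,
# but serve the 40d page when the DF2010 one is only a placeholder.

STUB_MARKERS = ("{{stub}}", "{{placeholder}}")  # hypothetical placeholder templates


def is_placeholder(wikitext: str) -> bool:
    """A page counts as a placeholder if it is empty, whitespace-only,
    or contains nothing but a stub marker."""
    text = wikitext.strip()
    if not text:
        return True
    return text.lower() in STUB_MARKERS


def resolve_article(df2010_text: str, legacy_text: str) -> str:
    """Return the text a search hit should show: the DF2010 page if it
    has real content, otherwise the 40d page."""
    if is_placeholder(df2010_text):
        return legacy_text
    return df2010_text


# An empty DF2010 page falls back to the 40d article text.
print(resolve_article("", "Cats adopt dwarves, not the other way around."))
```

This only decides which text to show; actually hooking it into MediaWiki's search results is the "manner of sorcery" part.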
However it would have been nice if the search defaulted to 40d info pages if the 2010 ones don't have content - which I believe would have cut down a lot of the complaints as most of them are along the lines of "I searched for X and all I got was a placeholder page".
No idea what manner of sorcery would be involved in that though.
The cheapest way would probably have been to put a redirect to the 40d version on the newly generated blank pages, though that would have made it hard for editors. At this point I'd just suggest cutting and pasting old data that is (at least as far as you know) correct when you come across empty pages.
You can get to the old 40d stuff from a link on the df2010 page.
"at least as far as you know" isn't exactly what we're looking for. DF2010 pages are supposed to not be stuff that MIGHT be accurate as of this version but stuff you can be SURE is because someone actually checked it instead of copy-paste a 4 page article. This is the best way to accomplish that.
I know, but it's a small link that many people apparently have missed, and in some people, seeing the stub page seems to induce the sort of rage that many get when they see "under construction" pages on websites.
Valid criticism: the links to 40d versions should be more prominent (especially if the page is short?).
Agreed.
Editing the pre-existing articles would've been absurd, especially with many more hypothetical releases in the next few years.
No, it would have been the sensible way to do this. After making the clone at 40d, of course. Or do you want a reset every 'major change' release? Because let's face it, the majority of the articles on there aren't likely to need much change.
Under your scrap everything logic we may as well wait for the next version before making new pages so we don't have to scrap all the hard work again...
We didn't "reset"; we didn't "scrap" things. Nothing is gone. Many of the articles are extensive, and so MOST will have some kind of change. A discussion came to the conclusion that "almost accurate" and "probably right" isn't exactly our goal, and that's the best you can get without requiring newly written articles. We saw the concern about scrapping things, so the solution was to have an old version accurate as of that version, and a new version accurate as of the new version.
Under your scrap everything logic
I'm not sure where you got 'scrap everything' from out of my post, but using extremes like that is a ridiculous way to debate an issue.
Both points of view involve having two versions of each page, one for each game version (plus 23a where applicable). Your view does not use namespaces; it would instead tack a "this information has not been checked to be currently accurate" template onto articles while storing the original. The way it was executed does basically the same thing from the other direction: move the original stuff, then copy it back in once it has been checked over, so as not to confuse new players with out-of-date information; and it uses namespaces for current articles rather than waiting for the next version. There is rather little difference; I don't see how one can be terrible while the other is sensible when they're so similar.
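To make the comparison concrete: the template approach would put a warning banner at the top of each unverified article, something like the fragment below ("OldVersion" is a made-up template name used for illustration, not one that exists on the wiki):

```wikitext
{{OldVersion|This page was written for 40d and has not yet been
checked for accuracy against the DF2010 release.}}
```

The namespace approach instead moves that same unverified text to `40d:Article` and leaves `DF2010:Article` blank until someone verifies the content, which is the same bookkeeping done from the other direction.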
The legacy information has not been scrapped in the slightest. It is simply one single click further away. Main links like url.com/Creatures now go to current version redirects, so whenever the next major version change comes out the non-namespace links will be just as effective. Namespaces will be required for each and every major version that is released. 99% of pages will need one copy per version anyhow. There is little difference between leaving it for now or for later as we have redirects. Really I see both methods as equally effective and with equal flaws.
I slightly disagree; the way it was done is better than the way he is proposing. The fact that he keeps stating things were "scrapped" when they weren't doesn't mean we have to equivocate on our end about the discussion. The plan was great. The execution was fantastic given limited (volunteer) manpower. The outcome is very useful as far as I can tell. Other methods could have worked, but not as well imo.
Edit - ARGH quotes messed up. Sigh - I have to go to work