> Why is the deletion (or even deletion proposal) regarded as such a heinous act that people feel the need to attack and bully others?

FWIW I don't see this as an attack (with, perhaps, the exception of a couple of comments in the linked thread). I posted the link to the Reddit thread because I see it more as an interesting observation on the myriad issues facing "legacy" languages and communities. To wit:

* Google appears to be canonical for finding secondary sources, according to the various arguments in the deletion proposals, yet we're all aware of how abysmal Google's search has been for a while now.

* What's the future of this policy given the fractured nature of the web these days, walled gardens, and now LLMs?

* An article's history appears to be irrelevant in the deletion discussion: the CPAN page (now kept) had 24 years of history on Wikipedia, with dozens of sources, yet was nominated for deletion.

* Link rot is pervasive, we all know this, but just how much of Wikipedia is being held up by the Wayback Machine?

* Doesn't this become a negative feedback cycle? Few sources exist, therefore we remove sources, therefore fewer sources exist.


> Google appears to be canonical for finding secondary sources, according to the various arguments in the deletion proposals, yet we're all aware of how abysmal Google's search has been for a while now.

Nobody is forcing you to use Google. If you can provide an acceptable source without the help of Google, go ahead. But the burden of proof is on the one who claims sources exist.

> An article's history appears to be irrelevant in the deletion discussion: the CPAN page (now kept) had 24 years of history on Wikipedia, with dozens of sources, yet was nominated for deletion.

Such is life when anyone can nominate anything at any moment... and when many articles that should never have been submitted in the first place slip through the cracks of haphazard volunteer quality control. (Stack Overflow also suffers from the latter.)

The sources are the only part that matters. And they sufficed to keep the CPAN article on the site, so the system works.

> Doesn't this become a negative feedback cycle? Few sources exist, therefore we remove sources, therefore fewer sources exist.

It was wrong to submit the article without sourcing in the first place. Circular sourcing is not allowed.


> The sources are the only part that matters. And they sufficed to keep the CPAN article on the site, so the system works.

The system works if the sources remain available, and in an environment predisposed to link rot that can be a problem. Imagine the hypothetical situation of archive.org disappearing overnight. Should we then delete all pages with it as their sole source if they're not updated within a week?

And the system works if intentions are pure - it seems here that the user who suggested the deletion of several Perl related pages is a fan of film festivals[1] and clearly wasn't happy that the "White Camel Award" has been a Perl award since the late 90s, and not a film festival award (which dates from the early 00s). At least according to Google. So they went on a bit of a rampage against Perl articles on Wikipedia.

You could argue "editor doing their job", but I would argue "conflict of interest".

[1]: https://en.wikipedia.org/w/index.php?title=Sahara_Internatio... # amongst many in their edit history


These are all bad-faith takes. What are you doing?

24 years ago, some people wrote on Wikipedia instead of elsewhere. So the wiki page itself became a primary source.

"The page shouldn't have been submitted..." This was a Wiki! If you're unfamiliar with the origin of the term, it was a site mechanism designed to lean in to quick capture and interweaving of documents. Volunteers wrote; the organization of the text arose through thousands of hands shaping it. Most of them were software developers at the time. At a minimum, the software-oriented pages should get special treatment for that alone.

You're acting as though this is producing the next edition of Encyclopedia Britannica, held to a pale imitation of its standards circa the 1980s. The thing is, Britannica employed people to go do research for its articles.

Wikipedia is not Britannica, and this retroactive "shame on them" is unbelievable nonsense.


Verifiability is a core policy on Wikipedia, and with time, citing your sources has become more and more important. Wikipedia isn't what it was in 2001. Articles can't survive on being verified by their own primary sources, for the same reason we don't want Wikipedia to become a dumping ground for advertisers who then cite their own site in an attempt to gain legitimacy. Secondary sources provide a solid ground truth that the subject in question has gained recognition and thus notability. If those secondary sources don't exist, we can't assume notability based on nothing.

Wikipedia isn't Britannica, because by this point it's probably a lot better than Britannica. They were comparable already in 2005,[1] and I have little reason to believe that Wikipedia is doing much worse on that front nowadays, even though they have vastly more content than Britannica.

[1] https://www.cnet.com/tech/tech-industry/study-wikipedia-as-a...


Some of the deleted pages never had the "sources missing" tag set for any significant time. They went straight to deletion.

Some pages that survived deletion (e.g. TPRF) had the "missing sources" tag set for 15 years… which, I have to admit, can justify some action. But that was not the case for the PerlMonks and Perl Mongers pages: those just got deleted on extremely short notice, making it impossible for the community to attempt any improvement.


7 days is policy for a deletion proposal,[1] which I can agree is not really enough time, although it's usually extended if talks are still ongoing.

There aren't really any rules about putting up notices and such before proposing deletion, and if you can't find anything other than primary sources, it doesn't seem unreasonable to propose deletion rather than a fix which can't be implemented. Thankfully, someone did find reliable sources for some of the articles.

[1] https://en.wikipedia.org/wiki/Wikipedia:Deletion_policy#Prop...


> If you can provide an acceptable source....

https://arstechnica.com/gadgets/2021/08/the-perl-foundation-...

https://www.theregister.com/2021/04/13/perl_dev_quits/

20 seconds.

If I ran Wikipedia I would ban everyone involved in this spectacle.


> And they sufficed to keep the CPAN article on the site, so the system works.

This is such an absurd take. “In this one example the system worked, so clearly it’s fine.”


The CPAN page on Wikipedia has existed for 24 years and has dozens of references, yet an editor nominated it for deletion - I can't help but see that as hostile. Fortunately this one has been voted "keep", but still...


The person who nominated it for deletion changed their opinion after suitable sources were found, and the article was thus kept within a day. That hardly seems hostile to me, but rather just someone trying to uphold Wikipedia's sourcing and notability requirements.


I'm sorry, but I just don't believe that. As stated below in several other comments, none of this makes sense and the Wikipedia editors hiding behind "this is the policy, you do not get to question it" stinks.

The original user withdrew their deletion suggestion and added the "This article relies excessively on references to primary sources. Please improve this article by adding secondary or tertiary sources." banner, sure. Why didn't they just do that in the first place?

Instead they looked at an article that had existed for twenty years, with a comprehensive history of changes and lots of information, links, and [albeit primary] sources; they did some cursory Googling, then suggested it for deletion - with a deadline of 7 days: https://en.wikipedia.org/w/index.php?title=CPAN&diff=1327587...

Wikipedia literally has its own page to suggest that you don't do that: https://en.wikipedia.org/wiki/Wikipedia:Chesterton%27s_fence...

Wikipedia's own policies around deletion mean it's easy to delete articles you don't particularly like - if they are old enough they probably lack secondary sources. You can't inform users off-Wikipedia who would be able to contribute: https://en.wikipedia.org/wiki/Wikipedia:Canvassing#Stealth_c... which means it's unlikely they will be updated before the deadline passes. Many of these articles were contributed by people who have long moved on, and few of us are paying attention to every possible thing on Wikipedia. Twenty years of history deleted in a week. That's wrong.

This feels like the actions of a newly promoted editor, inexperienced, and eager to start "cleaning up" Wikipedia, damaging it in the process. It also feels like the actions of an editor who, when editing another article, saw that the thing they were adding didn't point to what they expected on Wikipedia: https://en.wikipedia.org/w/index.php?title=White_Camel_award... # instead of adding a page to disambiguate, they decided to go on a crusade to purge articles that had existed for twenty years. And because these were mostly articles that predate Wikipedia's sourcing policies, they knew it was likely they would succeed.

As I've stated in one of the talk threads: https://en.wikipedia.org/wiki/User_talk:Left_guide#c-Leejeba... # I'm not particularly concerned about the restoration of some of the articles, instead I'm more concerned about the blunt application of policies that means important reference, history, and culture are being deleted.


> Why didn't they just do that in the first place?

Because they didn't find any. If secondary sources don't seem to exist, then there's no hope to begin with. Someone else found them, and that instantly made it clear that it's fixable.

> they did some cursory Googling, then suggested it for deletion - with a deadline of 7 days:

I fail to see the problem here, unless you have a problem with only getting 7 days. That's policy as far as I know,[1] and it would be nice to at least get two weeks, but I can't blame the individual proposer.

[1] https://en.wikipedia.org/wiki/Wikipedia:Deletion_policy#Prop...

Chesterton's fence goes both ways. Wikipedia policies are there for non-obvious reasons in some cases.

> You can't inform users off-Wikipedia who would be able to contribute:

Canvassing is one thing. Productively finding sources for an article is another. Wikipedia has had many a talk page drown in off-wiki people coming in and making an account to prevent something from happening, none of them understanding why the thing is happening, none of them understanding Wikipedia policy, and none of them caring. Inviting these people to discussions is not productive and just a bad time for everyone.

If you invite one or two people to actually improve the article, so it can survive on its own merit, I can't imagine anyone having a problem with that. The equivalent on-wiki thing of just pinging relevant editors is common and encouraged.

> This feels like the actions of a newly promoted editor, inexperienced, and eager to start "cleaning up" Wikipedia,

The user in question has no special user rights (they were automatically promoted to extended confirmed over a year ago), and has a few decently long articles under their belt.

> It also feels like the actions of an editor who, when editing another article, saw that the thing they were adding didn't point to what they expected on Wikipedia:

> instead of adding a page to disambiguate, they decided to go on a crusade to purge articles that had existed for twenty years. And because these were mostly articles that predate Wikipedia's sourcing policies, they knew it was likely they would succeed.

This is a rather unfavourable view of the situation, and not really one made in good faith. I can agree that that article shouldn't have just been turned into a redirect, but that redirect was made by a different user to the one who's been nominating articles for deletion, and I can't see any obvious connection.

An article being older than the sourcing requirements does not exempt it from those requirements. Older articles usually get a break because of that, but it's been well over 10 years by this point.

> As I've stated in one of the talk threads: https://en.wikipedia.org/wiki/User_talk:Left_guide#c-Leejeba... # I'm not particularly concerned about the restoration of some of the articles, instead I'm more concerned about the blunt application of policies that means important reference, history, and culture are being deleted.

Owen, who responded to your post, makes a good point. I'd argue that if Wikipedia deleting an article about something amounts to the deletion or destruction of history, reference, or culture, then the thing in question probably has some notability problems. Wikipedia makes it easier to find information about a particular topic, but can't be the only reasonable source of things. There must at least have been reliable sources for the article to be based on, even if they aren't easily available at this point.


You're basically reinforcing my arguments - these are the policies, deal with it.

I believe the 7-day deadline to avoid the deletion of 20+ years of history is destructive, because most of the people who would be notified of this have long since moved on, no longer care, or are off Wikipedia.

The cursory Googling by those who have the power to delete is also concerning. As stated elsewhere in this discussion, Google hasn't been great for search for a long time.


The policies are there for good reason most of the time, and rarely without there having been a lot of talk about what said policy should be. I found them very helpful during my time editing, since they accurately reflect what happens and why, with the whole process being transparent. Maybe I'm just biased.

Google isn't the end-all-be-all of sourcing, as has been shown by the articles that have been kept. If you can find reliable sources, it will be kept. Google is just the final nail in the coffin.


There it is, right? Seven days and twenty years, gone. To quote, it is "the slow decline, the emptying out, and the long, long process of forgetting".

Wikipedia's deletion proposals are the online equivalent of putting a small poster on a village noticeboard and being surprised that the entire world doesn't see it.

It's disgraceful.


It isn't Wikipedia's job or mission to remember. The Internet Archive took on that mission, hence you can still find the article there. The article isn't gone; it's just a bit less accessible. I love them both, but they work in very different ways.


I sell photographic prints. A breakdown of income and costs for this year can be found here: https://leejo.github.io/2025/11/01/print_costs/

TL;DR? It's a grind, an absolute grind.


My XPan is now over 25 years old, and I've been doing stuff like this with it for over a decade: https://www.youtube.com/watch?v=SIy2_IpEw8c # the electronics are still holding strong... for now. They tend to have more mechanical problems than electrical problems, in my experience. But yes, I certainly wouldn't spend anything like what they are going for these days.

Albert (the subject of the original article here) is a former colleague and I recently visited him at home where he showed me his studio and the cameras he'd been creating. All very cool stuff.


I may blog about this next year, again[^1], as I'm working on a project that sort of covers it - not in a way that will answer the question, but more observational.

Anyway, I feel Perl's popularity was hugely exaggerated in the mid to late 90s and early 00s. The alternatives were either not there in terms of language and toolchain features, ease of use, "whipuptitude" or whatever, or library support (CPAN was a killer app), or they were too old school or enterprisey. Sysadmins were using it everywhere, so it got into all sorts of systems that other languages couldn't reach without much more faff.

Its backward compatibility meant it stayed in those places for a long time. It's still in a lot of those places.

The fall in popularity the last decade or two was more of a regression to the mean, or perhaps below the mean. Many other languages have come along, which have contributed even more to the fall in share.

Yes, yes, Raku (né Perl 6), but I'd argue that effort also contributed a lot of really good stuff to CPAN. The Perl 5 core did get neglected for a number of years, as @autarch says, which may have been a factor.

[^1] previously: https://leejo.github.io/2017/12/17/tpc_and_the_end_of_langua...


> That's a good question. I wonder if the "virtual drum" was there to get over film holding issues (as in it physically bends the film) or that it's a line scanner

Both.

> personally I think that technology has come on enough to move on from the imacon/hasselblad: https://emulsive.org/articles/opinion/scanning-film-the-20k-...

It's not - the issue that still remains is keeping the film flat, and this is especially problematic with smaller formats. With current solutions you can get the resolution but not the flatness, or you sacrifice something to get the flatness (e.g. ANR glass holders). It's the old glass vs glassless carrier debate, applied to a modern workflow.

I repeat myself: focus, DPI / resolution, dynamic range - these are the solved problems. In fact, modern medium format digital cameras are superior on all these factors. Keeping the film flat, however? Only drum scans and the Imacon "Flextight" solution do this well.

Of course, it depends on what you plan to do with the scans and for 99% of people the solution in the link above is more than good enough.

I've written about this previously: https://leejo.github.io/tags/scanning/ # I'm going to add the fourth, and hopefully final, part in a couple of months' time.



Which is more believable or surprising:

a) there is probably several orders of magnitude more Perl code running out there in the wild than Rust?

b) the TIOBE index was ever meaningful?


I'm sure (a) is true. But TIOBE is supposed to be some balance between "total mass of existing code" and "current interest in new code".


The "d" in 3d means "domain", so three domains: the merchant, the card issuing bank, and the card scheme(s). The first two have to opt in to the process for it to be enabled, and most (all?) card issuing banks already have, so it's down to the merchant.

Not all merchants will opt in to 3d Secure, as they might see a greater loss in revenue due to the friction it creates versus the risk. They might be taking payments in a low risk sector and use other fraud checking factors, or it might not make sense for them - for example, where you end up having to produce the same card in person anyway, so "card not present" fraud doesn't factor in so much.

Some merchants don't opt in as it would lose them millions of dollars of payments an hour due to the friction: Amazon, for example.

I worked on the 3d Secure (and, formerly, "Verified by Visa") integration at my previous job, and for a long time I've been thinking I should write a blog post on what a complete mess of a protocol and implementation it [still] is. Haven't ever gotten around to that though.
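
To give a flavour of the moving parts, here's a rough sketch of the original 3d Secure 1.0 round trip from memory. Every helper name below is made up for illustration - this is a toy, not any real MPI library's API:

    #!/usr/bin/env perl
    use strict;
    use warnings;

    # Stubbed-out steps; a real MPI does signed XML over HTTPS here.
    sub directory_server_lookup { return { enrolled => 'Y', acs_url => 'https://acs.example/auth' } }
    sub build_pareq             { return 'PAReq-for-order-' . $_[0]->{order_id} }
    sub redirect_to_acs         { return { signed => 1 } }   # pretend the challenge succeeded
    sub verify_pares_signature  { return $_[0]->{signed} }

    sub authenticate_3ds {
        my ($card, $txn) = @_;

        # 1. Ask the scheme's directory server whether the card is
        #    enrolled (the VEReq/VERes exchange).
        my $veres = directory_server_lookup($card->{pan});
        return { authenticated => 0, liability => 'merchant' }
            unless $veres->{enrolled} eq 'Y';

        # 2. Send the cardholder's browser to the issuer's ACS with a
        #    Payer Authentication Request (PAReq); how the issuer then
        #    challenges them - password, OTP, a phone call - varies.
        my $pares = redirect_to_acs($veres->{acs_url}, build_pareq($txn));

        # 3. Validate the signed response (PARes). On success, the
        #    "card not present" fraud liability shifts to the issuer.
        return verify_pares_signature($pares)
            ? { authenticated => 1, liability => 'issuer' }
            : { authenticated => 0, liability => 'merchant' };
    }

    my $result = authenticate_3ds({ pan => '4111111111111111' }, { order_id => 42 });
    print "liability sits with the $result->{liability}\n";

The mess lives in the details each of those stubs hides: browser redirects, signature validation, and every issuer's ACS behaving slightly differently.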


> on what a complete mess of a protocol and implementation it [still] is

Banks are banks :)

> so it's down to the merchant

... or down to the implementation team that may not even have mentioned it to the merchant if said merchant is in an area used to insecure credit card payments ...

Opting out is still customer hostile if you ask me.


> Opting out is still customer hostile if you ask me.

That's debatable - I really dislike my own card issuer's implementation, as they will ring me rather than prompt for an OTP, which is a long process and not always convenient. Other card issuers have other implementations. That's one of the, er, issues with the protocol - a lack of consistency. There are many other problems with it.

I'm using this with a credit card, and that already has strong consumer protections if fraud should happen. I, as the consumer, do not get to opt out of this poorly implemented protocol.

Merchants are sold the protocol with the argument that it reduces chargebacks, i.e. reduces their costs, not that it is good for their consumers. If I (or someone else) makes a payment with my card, and it passes the 3d Secure process, then the chargeback option is a liability that is taken on by the issuing bank - and they shift that liability further by passing it on to the card holder: "This transaction went through 3d Secure, so your chargeback option for it is revoked".

That's hostile to the customer.

Like I said, I have a tonne of material for a blog post. I just need to be bothered to write it.


Chargebacks are extra work for the consumer too you know.

If we're philosophising, wouldn't it be better to have an honest system where the user authorizes all charges and the merchant doesn't get to auto-renew subscriptions without user input just because they feel like it?


> Chargebacks are extra work for the consumer too you know.

It's not about work; it's about the burden of cost due to fraud not being passed on to a consumer such that it could put them in financial difficulty. Chargebacks are there to protect the consumer, not the merchant - the 3d Secure "liability shift" (they literally call it this in the spec) flips that arrangement. Merchants are compelled to reduce their chargeback levels, as they have to pay for each chargeback case, and should chargebacks become frequent their ability to process payments will be revoked.

Just turn on 3d Secure and your merchant chargeback costs reduce significantly. Nice? Not for the consumer. But I repeat myself.

> If we're philosophising, wouldn't it be better to have a honest system where the user authorizes all charges and the merchant doesn't get to auto renew subscriptions without user input just because they feel like it?

Merchants probably should notify their users with subscriptions, sure - I got a notification a couple of months ago from F1TV that my subscription would renew, and maybe I don't want that subscription any more, or perhaps I want to change the level of my subscription. Other merchants won't be as nice, and dark patterns will creep in. Some companies have business models built on these recurring subscriptions.

I can't recall the rules around these, but I can recall that there are (were - we're going back 12 years here) systems in place to reduce issues for recurring payments, even when a cardholder's details are updated, including replacement of a card and its PAN[1]. Any subscriptions would be retained to avoid interruption, which might be critical for the consumer.

[1] https://en.wikipedia.org/wiki/Payment_card_number
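
As a rough sketch of the idea behind those updater services (everything here is made up for illustration - the real thing is a scheme-run batch service, not an in-process hash):

    use strict;
    use warnings;

    # Toy model of a merchant's card vault: when the issuer replaces a
    # card, the updater pushes the new details and the stored credential
    # is swapped in place, so recurring billing carries on uninterrupted.
    my %vault = (
        'customer-42' => {
            pan          => '4111111111111111',
            expiry       => '12/26',
            subscription => 'active',
        },
    );

    sub apply_account_update {
        my ($customer_id, $new_pan, $new_expiry) = @_;
        my $card = $vault{$customer_id} or return;
        @{$card}{qw(pan expiry)} = ($new_pan, $new_expiry);   # subscription untouched
    }

    apply_account_update('customer-42', '4111111111111113', '12/30');
    print "subscription is still $vault{'customer-42'}{subscription}\n";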


> Any subscriptions would be retained to avoid interruption, which might be critical for the consumer.

Sorry, that's complete and utter bullshit. Even if you think you're defending the customer's position, everything you said is in the vendor's favour. It "reduces friction" but only when it's in their interest.


> Sorry, that's complete and utter bullshit.

I'm saying that's how it is, not how it should be.


Please write that blog post if you can! It's such an interesting part of the industry imo, but there's basically no public documentation or discussion about it.


I may do so, eventually.

Related - I gave a talk a couple of weeks ago about banking interchange formats, which is related to all of this. The slides are here (top one) and the recording of the talk (which I will link) should appear soon: https://leejo.github.io/code/


> My experience with Perl is that often the batteries are rotting.

I think the batteries metaphor was meant to refer to the std lib, or (for Perl) the "CPAN" modules that are part of the core. The Perl core always keeps those batteries charged because they can't do a release if any of those are dead. They even ejected problematic modules, or those that were long since defunct, from the core so they don't have to deal with them. Python went through this exercise as well: https://peps.python.org/pep-0594/

The core Perl devs will also go to great effort to test against the entirety of CPAN ("blead breaks CPAN" testing), but some of those distributions haven't been maintained in decades, so they have to draw the line somewhere. Fortunately, if it's on CPAN then it's forkable (for the most part), so someone can take up maintenance.
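
If you want to poke at that core-module bookkeeping yourself, Module::CoreList (which ships with Perl) can tell you when a module entered and left the core. A small example using CGI, one of the ejected modules - assuming your Module::CoreList is new enough to know about the removal:

    use strict;
    use warnings;
    use Module::CoreList;

    # First perl release that shipped CGI in the core...
    my $first = Module::CoreList->first_release('CGI');

    # ...and the first release it was removed from (undef if still in core).
    my $gone = Module::CoreList->removed_from('CGI');

    printf "CGI was in core from perl %s until %s\n", $first, $gone // 'now';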

