
The Off Topic Topic

Started by Korea, March 10, 2009, 06:24:26 AM


The Brain

OK I read the link a little, and I don't get the impression that it's an unintended effect. If there are no small UK fora then children won't get abused on small UK fora.
Women want me. Men want to be with me.

garbon

Quote from: Sheilbh on December 19, 2024, 06:39:06 AMBeen banging the drum for ages - it's a dreadful law that everyone supports :lol: :bleeding:

Is his stance likely that they will just start going after everything? I'm not sure who would be able to do all that work.
"I've never been quite sure what the point of a eunuch is, if truth be told. It seems to me they're only men with the useful bits cut off."
I drank because I wanted to drown my sorrows, but now the damned things have learned to swim.

Sheilbh

Worth saying I think this is quite possibly also coming to the EU. The OSA is pretty similar to the Digital Services Act - particularly the bits that harmonise national law on online safety (rather than the EU-wide stuff about the platforms). I think it's still being worked out how those will be implemented nationally, and child sexual abuse material is a big part of that.

For example, they refer to the EU "mere conduit" point, which has been narrowed under the DSA, and I don't think it would cover forums. It's been narrowed to cover essentially only technical caching, hosting and transmission.

It's really frustrating. Everyone I know in the relevant sectors is really worried about, and opposed to, this whole legislative package in both the EU and UK. But it's really difficult to make the case politically, because basically every campaign following a tragedy has got involved (for example, eating disorder content on Instagram), or it's the "solution" to disinformation, or it'll fight CSAM. And ultimately, when you've got grieving parents, removing Musk's influence from politics and stopping CSAM on one side, it's really tough to argue against, even though it's a very bad law on really important issues.

It is, also, a massive enhancement of state power over online content, in ways that I think are quite concerning.
Let's bomb Russia!

Sheilbh

#93228
Quote from: garbon on December 19, 2024, 07:51:00 AMIs his stance likely that they will just start going after everything? I'm not sure who would be able to do all that work.
Well, yeah. Ofcom in particular are not the most fearsome regulator in the world.

Worth saying there was basically a game of pass-the-parcel over this law, because none of the relevant regulators (the Information Commissioner, Ofcom and the Competition and Markets Authority) wanted it. It is an enforcement nightmare which creates lots of legal duties and is also, I think, quite subjective.

So you could just take the risk - but you'd be taking it knowing, more or less, that you're in breach.

I think the interesting thing he's pointing to is something that I think is a big issue in the UK and EU more generally right now, which is that the way we've started doing regulation is increasing costs, and I'm not sure it's necessarily helping. So for example - the legal requirement here is that he does a risk assessment and takes steps to mitigate those risks. That's basically externalising the cost of regulation to companies. They're also the ones making the judgement calls - which is fine if you're a big company with a massive, specialised legal team (i.e. the platforms) but otherwise will probably just be a gravy train for consultants offering really bad advice (I see this all the time with GDPR stuff).

Even then you might get it wrong, and if there were an issue causing the regulator to enforce, they may turn around and say that the assessment was a bit duff and the mitigations weren't strong enough. But this is happening across sectors - the fire safety stuff around Grenfell was principles-based fire safety regulation with external consultants assessing what is safe; environmental and equalities laws are full of obligations to do assessments and devise mitigations; it's in data protection law, and in KYC and anti-bribery assessments for relatively small organisations; it's even in Martyn's Law, which requires venues with a capacity of more than 800 to do terrorism risk assessments and prepare plans in the event of a terrorist attack.

All of these things are good in principle, and it's tough to argue against them without sounding like you oppose fire safety standards, the environment, combating CSAM, the mother of a man killed in a terrorist attack etc. But I think government sees them as basically no cost, because it doesn't have to do anything. All it's doing is moving that work and those costs to everyone else, and it's not clear it's producing better results versus dodgy consultants making a lot of money.

And as you say all of those costs only really apply if you're a well-meaning business trying to follow the law. If you do not give a fuck or have the money to fight any challenge, then you might just take the risk.

I think it's a big, costly problem.

Edit: But as I say, I don't think legislators see it as a problem, or as costly, because they're not bearing the costs - and who could oppose doing proper risk assessments and mitigating those risks? Obviously campaigners also don't have an issue, because they're campaigners who care about their big issue and aren't (unlike legislators) meant to look at the bigger picture.

Tamas

Quote from: garbon on December 19, 2024, 07:51:00 AM
Quote from: Sheilbh on December 19, 2024, 06:39:06 AMBeen banging the drum for ages - it's a dreadful law that everyone supports :lol: :bleeding:

Is his stance likely that they will just start going after everything? I'm not sure who would be able to do all that work.

The threat is enough.

garbon

Quote from: Tamas on December 19, 2024, 08:21:17 AM
Quote from: garbon on December 19, 2024, 07:51:00 AM
Quote from: Sheilbh on December 19, 2024, 06:39:06 AMBeen banging the drum for ages - it's a dreadful law that everyone supports :lol: :bleeding:

Is his stance likely that they will just start going after everything? I'm not sure who would be able to do all that work.

The threat is enough.

For him

Sheilbh

Quote from: garbon on December 19, 2024, 08:22:05 AMFor him
Yeah for him and other businesses who start from the position of trying to follow the law.

Obviously for anyone in the public sector it's not even negotiable.

Edit: Which, again, I think is part of the problem - the cost is being taken by the organisations trying to do the right thing (do what the law says they should).

The Brain

Quote from: garbon on December 19, 2024, 08:22:05 AM
Quote from: Tamas on December 19, 2024, 08:21:17 AM
Quote from: garbon on December 19, 2024, 07:51:00 AM
Quote from: Sheilbh on December 19, 2024, 06:39:06 AMBeen banging the drum for ages - it's a dreadful law that everyone supports :lol: :bleeding:

Is his stance likely that they will just start going after everything? I'm not sure who would be able to do all that work.

The threat is enough.

For him

For anyone who isn't an outlaw outfit it's not only about the threat but about not doing illegal things. You can't knowingly break the law.

garbon

Quote from: The Brain on December 19, 2024, 08:56:51 AM
Quote from: garbon on December 19, 2024, 08:22:05 AM
Quote from: Tamas on December 19, 2024, 08:21:17 AM
Quote from: garbon on December 19, 2024, 07:51:00 AM
Quote from: Sheilbh on December 19, 2024, 06:39:06 AMBeen banging the drum for ages - it's a dreadful law that everyone supports :lol: :bleeding:

Is his stance likely that they will just start going after everything? I'm not sure who would be able to do all that work.

The threat is enough.

For him

For anyone who isn't an outlaw outfit it's not only about the threat but about not doing illegal things. You can't knowingly break the law.

That's fair, and I was wondering if there's also interpretation at play. Now that I've googled about this, I can't find anyone speaking about the danger to small online forums who isn't discussing this one particular instance. You'd think there would be a lot of chatter about more than one instance, right?

Then I stumbled upon this Telegraph article. Dee Kitchen is the same person as Velocio, whose link we all just read. While I think the online safety bill is still garbage (and will only be haphazardly enforced), I'm not sure I'd call an organisation that has 250,000 users across its 300 sites (forums?) just a small forum. That sounds like a pretty sizable operation; if you want better online safety controls... well, this seems like a relevant target?

https://www.telegraph.co.uk/business/2024/12/17/hundreds-of-websites-to-shut-down-under-chilling-internet/?ICID=continue_without_subscribing_reg_first

QuoteHundreds of websites to shut down under UK's 'chilling' internet laws

Hundreds of websites will be shut down on the day that Britain's Online Safety Act comes into effect, in what are believed to be the first casualties of the new internet laws.

Microcosm, a web forum hosting service that runs 300 sites including cycling forums and local community hubs, said that the sites would go offline on March 16, the day that Ofcom starts enforcing the Act.

Its owner said they were unable to comply with the lengthy requirements of the Act, which created a "disproportionately high personal liability".

The new laws, which were designed to crack down on illegal content and protect children, threaten fines of at least £18m for sites that fail to comply with the laws.

On Monday, Ofcom set out more than 40 measures that it expects online services to follow by March, such as carrying out risk assessments about their sites and naming senior people accountable for ensuring safety.

Microcosm, which has hosted websites including cycling forum LFGSS since 2007, is run as a non-profit funded by donations and largely relies on users to follow community guidelines. Its sites attract a combined 250,000 users.

Dee Kitchen, who operates the service and moderates its 300 sites, said: "What this is, is a chilling effect [on small sites].

"For the really small sites and the charitable sites and the local sports club there's no carve-out for anything.

"It feels like a huge risk, and it feels like it can be so easily weaponised by angry people who are the subject of moderation.

"It's too vague and too broad and I don't want to take that personal risk."

Announcing the shutdown on the LFGSS forum, they said: "It's devastating to just ... turn it off ... but this is what the Act forces a sole individual running so many social websites for a public good to do."

Microcosm, a free service, is used to run hundreds of forums dedicated to cycling groups, technology and local communities.

After the announcement, one member wrote: "This is just devastating." Others suggested moving the website's hosting overseas, or onto a chat app such as Discord.

The Online Safety Act, passed by the last government, is designed to stop social media users accessing terrorist material and abuse images, and will introduce strict age checking requirements to ensure that underage users are not able to access the services.

The laws introduce extra requirements on larger social media sites and search engines, but Ofcom says it will cover online services "from large and well-resourced companies to very small 'micro-businesses'".

In a consultation response earlier this year, the social media site Reddit said the online safety laws could lead companies to quit Britain, saying: "If thresholds are premised on certain characteristics and functionalities alone, small-to-medium sized platforms will bear disproportionate economic, operational and competitive disadvantages when placed in the same category as much larger companies."

An Ofcom spokesman said: "We know there's a diverse range of services in scope of the UK's new online safety laws. The actions sites and apps will need to take will depend on their size and level of risk."

"We're providing support to online service providers of all sizes to make it easier for them to understand – and comply with – their responsibilities."

crazy canuck

I understand the argument and the concerns against this sort of law, but in a world where the majority of citizens now obtain their information from online sources and social media in particular, how does that environment get regulated?


Sheilbh

#93235
Quote from: crazy canuck on December 19, 2024, 09:32:50 AMI understand the argument and the concerns against this sort of law, but in a world where the majority of citizens now obtain their information from online sources and social media in particular, how does that environment get regulated?
So actually a lot of my issue with the OSA and the DSA is that there are very broad obligations that are subjective.

For example, in the OSA there's a "duty of care" to users, in particular in relation to "legal but harmful" content. I think that is potentially quite worrying, and it also moves judgements that I think should be subject to democratic and legal control to the compliance functions of already too powerful companies. Similarly, the rights under the DSA to identify anonymous accounts and take down posts are exceptionally broad and are generically applied to "public authorities" - including the police, for example. There is no requirement to obtain a court order. We've already seen some very controversial uses by French police - the rubber will hit the road when we start seeing it used by, say, "public authorities" under Meloni or Orban's control.

My own view would be that these laws do point to the correct answer - but it's unpalatable, politically difficult and going to be very heavily lobbied against by tech firms. I would treat the platforms as basically publishers - especially as they're starting to move more into AI, using their models to generate new content. An example just this week: Apple's AI summarisation tool, for some users, incorrectly summarised a set of BBC breaking news notifications as "Luigi Mangione shot" - he wasn't. That's not a "mere conduit"; that is publishing a summary. (Edit: And I've mentioned before that their own business model argues against this, possibly unlike forums. The mere conduit protection is for ISPs or telcos - organisations that basically are pipes, looking at metadata in order to direct it correctly, or just indexing etc. The entire business model of Google, Meta etc is looking at the content in order to personalise content and advertising in the future - that's what they monetise. And I don't think you can have it both ways - if you're making money by looking at, analysing and selling what your users are doing on your platforms, then you can't disown responsibility for what users are doing on your platforms.)

I'd enforce existing laws online - we've seen this a little after the riots over the summer here (which Musk is obsessed with). If it is illegal to give rioters a map with a target and tell them to burn that place down in the real world (and it is), then it is also illegal on Twitter, and those people should be prosecuted. Same for the forums in France that Gisele Pelicot's husband was using for his activities - it would be a crime in person, so it's a crime online. That would require more resources and a change in law enforcement approach. I'm often reminded of government boasts here that, excluding online identity theft and fraud, crime is at a record low - which excludes the single largest and fastest-growing category of crime.

And I'd break them up, because I think they have an unfair market advantage in advertising. It is insane to me that we have the most valuable companies in the world making their money primarily through online advertising, while actual news publishers are going to the wall because of the collapse of online advertising revenue - which means that increasingly the most viable business model is a paywall/subscriber model, which not all people can afford or want to buy. These are two sides of the same coin: at exactly the point that reported, legaled information is becoming a luxury good, people are instead relying on free platforms (which are not publishers and don't care about reporting or legaling) for their information. I also think it undermines their mere conduit argument. But we buy too much into their own hype. Most of these companies are basically vertically integrated online advertising oligopolies, and the state should break them up. I also think that could lead to a larger share of the online advertising revenue going to news publishers, which would support them in looking at other business models and in moving away from good information only being available to those who can afford it.

(But this is an obsession of mine because I don't think the problem was that people started getting their information from social media companies, I think it's that the social media companies' practices destroyed the economic model of getting your information from anywhere else...)

crazy canuck

Thank you for the thoughtful post.  I like the idea of treating sites more like publishers and the enforcement of existing laws is particularly appealing.


Barrister

Quote from: The Brain on December 19, 2024, 08:56:51 AMFor anyone who isn't an outlaw outfit it's not only about the threat but about not doing illegal things. You can't knowingly break the law.

Of course you can knowingly break the law.  People do it every day.  Any time you exceed the posted speed limit you're breaking the law.  Every time you jaywalk you break the law.

There's a legal maxim - de minimis non curat lex.  In English: "the law does not deal with trifles".  On a certain level we just understand and trust that the authorities and the courts are not going to bother going after trifling things.  Let's say you out-and-out steal a grape from the supermarket.  Cops are probably not going to press charges.  Prosecutors aren't likely to prosecute.  And even if they do - the judge will almost certainly throw the case out, citing de minimis non curat lex.

So I looked at the forum in question - it's the London Fixed Gear and Single-Speed (LFGSS), a forum dedicated to fixed-gear and single-speed bicycles in the London area.  Now it does look like a reasonably successful site with thousands of registered users, but still a pretty small fish in the sea.

There's certainly an argument to be made that they could likely just carry along as they have been.  As long as they make good-faith attempts to deal with abusive or illegal material if such ever does show up, the authorities would seem highly unlikely to go after this site, even if it is not in compliance with the law.

Now obviously it's the owner's call to make, not mine.  And obviously I'm not authorized to practice law or give legal advice in England.  But if I were the owner, I think I'd be tempted to just double-check my content moderation strategies and pretty much carry on as before.

(and yes, I'm aware there's some irony in a person doing my job saying "well just trust the system".  Make of that what you will, but sometimes you can just trust the system)
Posts here are my own private opinions.  I do not speak for my employer.

Josquius

For the record I have nothing to do with this London bike forum. Another forum just linked to their writeup amidst their panic.
██████
██████
██████

Sheilbh

Another example I'd give is small press publishers. Some groups I'm in, including Irish printers, are in an absolute flap about the new EU General Product Safety Regulation. I'm 99% sure it doesn't really affect them, and even more sure that none of them will have to do anything.

But if you're a small operator and you want to do the right thing you probably do worry a bit and maybe end up going a bit above and beyond.

As I say I do think a lot of these types of regulation basically favour the big players who can afford legal advice on what to take a risk on and what not to.