Amazon drivers poop in bags and pee in bottles

Started by The Larch, March 26, 2021, 09:28:02 AM

viper37

Quote from: The Larch on March 26, 2021, 11:52:29 AM
I would like to know if these well-known issues with how Amazon treats its workers have affected any of our shopping habits. Personally I've avoided Amazon like the plague since these kinds of things started becoming public knowledge.
About as much as revelations about Wal-Mart's treatment of its employees changed physical shopping habits... Not much.
I don't do meditation.  I drink alcohol to relax, like normal people.

If Microsoft Excel decided to stop working overnight, the world would practically end.

Admiral Yi

Quote from: Sheilbh on March 26, 2021, 09:47:00 AM
So there are, I'd say, three main ways a camera can be racist: if it doesn't have the data to work with all races; if the historic data it's trained on is biased; or if it is trained to be biased. I think the issue here is particularly around the second point, but I could be wrong.

There are issues, for example, with facial recognition technology that has primarily been trained on white faces not recognising the faces of black or Asian people. So if your training dataset is limited in that way (and this was a big problem with a lot of early AI), then it literally will not work for all races, which is crucial if it's a gateway to services or something. At the other extreme, there's the facial recognition/surveillance technology developed by Chinese companies (who supply Western governments and businesses too) that has been specifically trained to identify Uyghurs.

The other issue, which I think is probably more relevant here, is the data used to train the AI in these systems. Basically, if the raw information used to train the system reflects existing discrimination and racism, then the AI will reproduce results with that discrimination and racism. Except, because it comes from a machine or an algorithm, it will have the patina of being a neutral tech solution, even though what it has learned from the input data is to reproduce human biases.

So there are loads of algorithms that can be used in recruitment, for example, but you can't just train them on historic data or you're just making an electronic version of whatever recruitment biases existed (I think Amazon tried to build something like this based on their historic hiring decisions and found the algorithm would auto-reject loads of good women applicants because there was historic bias in their recruitment).

There's a lot of thinking and potential regulation around how to do AI in a fair way when it comes to making decisions about individuals (which is restricted in Europe and other countries), and how to ethically build AI to behave as we want it to - which is generally in a neutral way - rather than just reflecting pre-existing biases. One of the issues, though, is the lack of transparency within the algorithm, which can make it difficult to understand or explain what it is doing and how it does it - this tends to be the big hurdle in Europe, where you need to "meaningfully explain" automated decisions to individuals - and often, at this stage, we can't.

Also, separately, employee monitoring is very, very difficult in Europe and is creepy :ph34r:

I don't see how any of these things relate to monitoring on-the-job performance.
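To make the biased-training-data point from the quoted post concrete, here is a minimal, hypothetical Python sketch. Everything in it is invented for illustration (the synthetic data, the 0.5 bias factor, the crude frequency "model"); it is not Amazon's or anyone's actual system, just the general mechanism: fit anything to historically biased hiring labels and its scores reproduce the bias.

# A minimal, hypothetical sketch: a "recruitment model" trained on historically
# biased hiring decisions reproduces that bias. All data is synthetic.
import random

random.seed(0)

def make_historic_data(n=10_000):
    """Synthetic past hiring decisions with a built-in bias:
    equally qualified women were hired less often than men."""
    rows = []
    for _ in range(n):
        gender = random.choice(["M", "F"])
        skill = random.random()        # true qualification, 0..1
        hire_prob = skill              # merit-based component
        if gender == "F":
            hire_prob *= 0.5           # historic bias against women
        rows.append((gender, skill, random.random() < hire_prob))
    return rows

def train_naive_model(rows):
    """'Learn' a hire rate per (gender, skill band) from the historic data,
    standing in for any model fitted to these labels."""
    counts, hires = {}, {}
    for gender, skill, hired in rows:
        key = (gender, round(skill, 1))    # coarse skill band
        counts[key] = counts.get(key, 0) + 1
        hires[key] = hires.get(key, 0) + int(hired)
    return {key: hires[key] / counts[key] for key in counts}

model = train_naive_model(make_historic_data())

# Two candidates with identical qualifications get different scores, because
# the only thing the model could learn the gap from is the biased labels.
print("qualified man:  ", round(model[("M", 0.8)], 2))
print("qualified woman:", round(model[("F", 0.8)], 2))

The point of the sketch is that swapping in a fancier algorithm would not help: the bias sits in the training labels themselves, which is exactly the "second point" in the quoted post.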

Sheilbh

:lol: Fair - my point was these are some of the ways a camera can be racist.
Let's bomb Russia!

Admiral Yi

Quote from: garbon on March 26, 2021, 10:58:17 AM
I would assume it's because we mostly agree the general situation is a problem... so not much to discuss there?

I feel like I don't have enough information to comment meaningfully.

Berkut

I think CC is right, the reason we have such terrible labor laws (and we do, and we know this because he says so, and his well known record on understanding exactly how America works in every detail is not questionable) is because I felt like I was allowed to comment on one particular detail in a post, which obviously means I am definitely on Amazon's side in all other details. Clearly.

His ability to just really understand what it is to be an American, in America, and everything about America, is just uncanny.
"If you think this has a happy ending, then you haven't been paying attention."

select * from users where clue > 0
0 rows returned

Berkut

Quote from: garbon on March 26, 2021, 10:58:17 AM
Quote from: crazy canuck on March 26, 2021, 10:27:42 AM
A story about how ill-treated Amazon workers pee in bottles because they do not get sufficient work breaks. It is striking that the labour laws would be so bare as to permit this sort of thing to occur. But in true Languish form, a US poster objects to a throwaway use of the word "racist" rather than discussing what the article is actually about. Perhaps this is an example of why US labour laws are so abysmal?

I would assume it's because we mostly agree the general situation is a problem... so not much to discuss there?

Don't lose focus on what is important here!
"If you think this has a happy ending, then you haven't been paying attention."

select * from users where clue > 0
0 rows returned

grumbler

Quote from: Berkut on March 26, 2021, 02:01:35 PM
I think CC is right, the reason we have such terrible labor laws (and we do, and we know this because he says so, and his well known record on understanding exactly how America works in every detail is not questionable) is because I felt like I was allowed to comment on one particular detail in a post, which obviously means I am definitely on Amazon's side in all other details. Clearly.

His ability to just really understand what it is to be an American, in America, and everything about America, is just uncanny.

CC is a left-wing mirror-Trumpeter.  Everything must be OUTRAGEOUS if he doesn't approve of it.  E.g., if you ever question the cost-effectiveness of any Green Party policy anywhere, you must want to DESTROY THE PLANET!
The future is all around us, waiting, in moments of transition, to be born in moments of revelation. No one knows the shape of that future or where it will take us. We know only that it is always born in pain.   -G'Kar

Bayraktar!

Zanza

The lack of bathroom breaks for delivery drivers (and drivers urinating in bottles) has also been reported in Germany.

It is hard to address even if, as a society, we want to. And we should want to, as delivery drivers are both essential and have virtually no bargaining power because the job is unskilled. Having a group of people working under these conditions should not be tolerated.

The drivers are often "independent" subcontractors, so normal employee rules would not apply to them, unless we force the delivery services to hire all their drivers. That would be a very severe market intervention, which would likely fall foul of EU single market regulation and the idea of contractual freedom.

The pressure from the high number of packages to be delivered per day, subcontractors and employees being liable for the delivered goods, and subcontractors having to provide a van at their own cost is problematic. Not sure how to address that. I guess the liability part, at least for employed drivers, should be addressable with mandatory insurance bought by the employers or something similar.

Also, the very nature of delivery driving makes it virtually impossible to provide private restrooms for drivers close to their routes. Corona made it worse, with public restrooms closed. That said, more accessible public restrooms - coming with their own issues - seem to be a realistic policy to help these people out.

Tamas

Terrible working conditions are one thing.

Monitoring workers is another. I am not sure employing cameras for the purpose is so much different from having people employed for that purpose do it in person, but it could be.

garbon

Quote from: Tamas on March 28, 2021, 05:10:25 AM
Terrible working conditions are one thing.

Monitoring workers is another. I am not sure employing cameras for the purpose is so much different from having people employed for that purpose do it in person, but it could be.

Well, for one, the AI behind the cameras could catch many more 'infractions' than a human overseer. It will also have no compassion.
"I've never been quite sure what the point of a eunuch is, if truth be told. It seems to me they're only men with the useful bits cut off."
I drank because I wanted to drown my sorrows, but now the damned things have learned to swim.

Tamas

Quote from: garbon on March 28, 2021, 05:28:36 AM
Quote from: Tamas on March 28, 2021, 05:10:25 AM
Terrible working conditions are one thing.

Monitoring workers is another. I am not sure employing cameras for the purpose is so much different from having people employed for that purpose do it in person, but it could be.

Well, for one, the AI behind the cameras could catch many more 'infractions' than a human overseer. It will also have no compassion.

I highly doubt the incidents flagged by the "AI", if we want to call a glorified algorithm that, would not be reviewed by a human before decisions are made on them.

Zanza

The cameras can give immediate feedback to the driver:
Quote
The systems can then provide real-time feedback, telling a driver to take a break or keep their eyes on the road.
https://www.google.com/amp/s/www.theverge.com/platform/amp/2021/3/24/22347945/amazon-delivery-drivers-ai-surveillance-cameras-vans-consent-form
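A hypothetical sketch of that kind of real-time feedback rule, essentially an event-to-alert mapping. The event names, messages and the assumption that this is the whole logic are invented for illustration, not Netradyne's or Amazon's actual rules.

# Hypothetical sketch of an in-cab, real-time feedback rule like the one the
# linked article describes. Event names and messages are made up.
from typing import Optional

ALERTS = {
    "drowsiness": "Please take a break.",
    "distracted_driving": "Please keep your eyes on the road.",
    "hard_braking": "Maintain a safe following distance.",
}

def in_cab_feedback(event: str) -> Optional[str]:
    """Map a camera-detected event to an immediate audio/on-screen alert;
    unrecognised events produce no alert."""
    return ALERTS.get(event)

for event in ["drowsiness", "distracted_driving", "unknown_event"]:
    print(event, "->", in_cab_feedback(event))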

garbon

Quote from: Tamas on March 28, 2021, 05:32:22 AM
Quote from: garbon on March 28, 2021, 05:28:36 AM
Quote from: Tamas on March 28, 2021, 05:10:25 AM
Terrible working conditions are one thing.

Monitoring workers is another. I am not sure employing cameras for the purpose is so much different from having people employed for that purpose do it in person, but it could be.

Well, for one, the AI behind the cameras could catch many more 'infractions' than a human overseer. It will also have no compassion.

I highly doubt the incidents flagged by the "AI", if we want to call a glorified algorithm that, would not be reviewed by a human before decisions are made on them.

Depends on how the data is reported back. Surely there's a difference between a manager noting an unauthorised break being taken and understanding why, vs. a data report that shows driver x was late y times in the past z months.

I've been running a training series at my work where I keep attendance so we know who has attended the training and who has not.  If I just sent that grid to my manager without any commentary, there are multiple people who might get a stern talking-to who just happened to have leave already scheduled at the time of a couple of training sessions.
"I've never been quite sure what the point of a eunuch is, if truth be told. It seems to me they're only men with the useful bits cut off."
I drank because I wanted to drown my sorrows, but now the damned things have learned to swim.
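The attendance example above can be made concrete with a tiny sketch. The names, sessions and leave records below are made up; the point is only that the raw grid on its own flags people whose absences were already explained, while joining in the leave context leaves only the genuinely unexplained misses.

# Hypothetical illustration of the attendance-grid example: the raw grid alone
# flags people who simply had leave booked; adding that context removes the
# false positives. All names and data are invented.

attendance = {                      # session -> who attended
    "session_1": {"Ann", "Bob"},
    "session_2": {"Ann", "Cara"},
    "session_3": {"Ann", "Bob", "Cara"},
}
staff = {"Ann", "Bob", "Cara", "Dev"}
scheduled_leave = {                 # session -> who had leave booked
    "session_1": {"Cara"},
    "session_2": {"Bob"},
    "session_3": {"Dev"},
    # Dev also missed sessions 1 and 2 with no leave booked
}

def missed(with_context: bool) -> dict:
    """Count missed sessions per person, optionally excusing booked leave."""
    counts = {person: 0 for person in staff}
    for session, present in attendance.items():
        for person in staff - present:
            if with_context and person in scheduled_leave.get(session, set()):
                continue            # excused: leave was already booked
            counts[person] += 1
    return counts

print("raw grid:     ", missed(with_context=False))
print("with context: ", missed(with_context=True))
# Raw grid: Bob and Cara each show a miss; with leave taken into account,
# only Dev's two unexplained absences remain.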

Tamas

Quote from: garbon on March 28, 2021, 06:26:18 AM
Quote from: Tamas on March 28, 2021, 05:32:22 AM
Quote from: garbon on March 28, 2021, 05:28:36 AM
Quote from: Tamas on March 28, 2021, 05:10:25 AM
Terrible working conditions are one thing.

Monitoring workers is another. I am not sure employing cameras for the purpose is so much different from having people employed for that purpose do it in person, but it could be.

Well, for one, the AI behind the cameras could catch many more 'infractions' than a human overseer. It will also have no compassion.

I highly doubt the incidents flagged by the "AI", if we want to call a glorified algorithm that, would not be reviewed by a human before decisions are made on them.

Depends on how the data is reported back. Surely there's a difference between a manager noting an unauthorised break being taken and understanding why, vs. a data report that shows driver x was late y times in the past z months.

I've been running a training series at my work where I keep attendance so we know who has attended the training and who has not.  If I just sent that grid to my manager without any commentary, there are multiple people who might get a stern talking-to who just happened to have leave already scheduled at the time of a couple of training sessions.

It all depends on how it will be used in practice. If it is used to have drivers/employees fired on the spot when the "AI" gives a warning, then that's obviously stupid and evil. If the "AI" gives a warning that is reviewed by a human supervisor, well, I don't see too much of a problem with it, although it can be abused and overdone and create unnecessary stress. But plenty of corporate processes do that already all over the world.
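A minimal sketch of the human-in-the-loop setup described above, under the assumption that flags only ever enter a review queue and nothing happens until a supervisor upholds them. The class, driver and event names are invented, not Amazon's actual process.

# Hypothetical sketch: the "AI" can only queue flags for review; disciplinary
# action can only follow from flags a human supervisor has upheld.
from dataclasses import dataclass, field

@dataclass
class Flag:
    driver: str
    event: str
    reviewed: bool = False
    upheld: bool = False

@dataclass
class ReviewQueue:
    pending: list = field(default_factory=list)

    def flag(self, driver: str, event: str) -> None:
        """Called by the camera system: record the incident, never act on it."""
        self.pending.append(Flag(driver, event))

    def human_review(self, flag: Flag, upheld: bool) -> None:
        """A supervisor decides; only upheld flags feed any further process."""
        flag.reviewed = True
        flag.upheld = upheld

queue = ReviewQueue()
queue.flag("driver_42", "unauthorised_break")
queue.flag("driver_42", "distracted_driving")

# The supervisor dismisses the first flag (say, no restroom was reachable on
# the route) and upholds the second.
queue.human_review(queue.pending[0], upheld=False)
queue.human_review(queue.pending[1], upheld=True)

actionable = [f for f in queue.pending if f.reviewed and f.upheld]
print(actionable)  # only the human-confirmed incident remains actionable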