Bay 12 Games Forum

Pages: 1 ... 48 49 [50] 51 52 ... 158

Author Topic: Tech News. Automation, Engineering, Environment Etc  (Read 265682 times)

Egan_BW

  • Bay Watcher
  • Normalcy is constructed, not absolute.
    • View Profile
Re: Tech News. Automation, Engineering, Environment Etc
« Reply #735 on: May 07, 2017, 03:55:34 pm »

It's a scan of an armadillo toy model thing, from the Stanford datasets https://graphics.stanford.edu/data/3Dscanrep/
And as noted on that page, it's (along with the bunny) one of the few canonical models where distortion isn't generally frowned upon.
BRB, creating new religion to worship that bunny and that armadillo.
Logged

Max™

  • Bay Watcher
  • [CULL:SQUARE]
    • View Profile
Re: Tech News. Automation, Engineering, Environment Etc
« Reply #736 on: May 08, 2017, 05:28:50 am »

Reading news stories here and there about customs giving people a shitty choice -- let them check your phone or be detained arbitrarily -- makes me wonder how difficult it would be to whip up a fake command-line lockscreen GUI to gate access to the rest of the phone, one that fucks with the typical commands anyone used to CLI conventions would try.

>_
>help
Would you like to play a game?
[Yes/No] n
Are you sure?
[Yes/No] y
So you do want to play a game, or you are sure you don't want to play a game?
[Yes/No] n
Ah, so we're playing a game then, wonderful.
[Player 1 Name] exit
Hello exit, would you like to go left or right?
[Left/Right] l
You have been eaten by a grue.
[Continue?] n
Shutting down now.
*phone shuts off, similar dialog trees repeat when it is turned back on, entering the text of the 4th amendment gives a functional CLI which can be used to boot up the rest of the phone*
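The whole thing is really just a read-eval loop that never evaluates anything real. A minimal sketch in Python -- the unlock phrase and all the canned responses here are made up, obviously:

```python
# A decoy "shell" that never evaluates anything real -- the unlock phrase
# and the canned responses are invented for illustration only.
UNLOCK = "the right of the people to be secure"  # start of the 4th Amendment

def decoy_shell(lines):
    """Return the responses the fake CLI would print for each input line."""
    out = []
    for line in lines:
        cmd = line.strip().lower()
        if UNLOCK in cmd:
            out.append("Booting real shell...")  # hand off to the real phone
            break
        elif cmd in ("y", "yes"):
            out.append("So you do want to play a game, or you are sure you don't?")
        elif cmd in ("n", "no"):
            out.append("Ah, so we're playing a game then, wonderful.")
        elif cmd in ("help", "ls", "cd", "exit", "quit", "sudo"):
            out.append("Would you like to play a game? [Yes/No]")
        else:
            out.append("You have been eaten by a grue.")
    return out
```

Everything a frustrated customs officer types just walks them deeper into the game tree.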

Bonus would be using a cheap backup phone you can let go of if they decide to try and be extra dickish about it.
Logged

Reelya

  • Bay Watcher
    • View Profile
Re: Tech News. Automation, Engineering, Environment Etc
« Reply #737 on: May 10, 2017, 06:49:33 pm »

https://yro.slashdot.org/story/17/05/10/1441225/police-to-test-app-that-assesses-suspects

Quote
Police in Durham are preparing to go live with an artificial intelligence (AI) system designed to help officers decide whether or not a suspect should be kept in custody, BBC is reporting. The system classifies suspects at a low, medium or high risk of offending and has been tested by the force. It has been trained on five years' of offending histories data. One expert said the tool could be useful, but the risk that it could skew decisions should be carefully assessed. Data for the Harm Assessment Risk Tool (Hart) was taken from Durham police records between 2008 and 2012. The system was then tested during 2013, and the results -- showing whether suspects did in fact offend or not -- were monitored over the following two years. Forecasts that a suspect was low risk turned out to be accurate 98% of the time, while forecasts that they were high risk were accurate 88% of the time.

I think the advantages of relying on such an app far outweigh the risks. Decisions are already heavily skewed by the personal biases/feelings/malintent of individual police officers, and officers are being asked daily to make complex decisions about things they can't possibly know the ramifications of. Not every beat policeman can be expected to be an expert in domestic violence, crime/race relations and all the rest, or to make completely objective decisions that minimize harm to society as a whole. It's basically not a doable task, and thus AI for police decisions is going to become a huge thing.

Having an app like this that gives its own two cents is also one less excuse for crooked cops to screw with people over stuff that's unrelated to the crime.

This app will give a risk assessment that's independent of any personal biases, and will see police resources better targeted at high-risk individuals, so you'd see fewer young people being locked up, but more of those who are locked up being the likely offenders.
« Last Edit: May 10, 2017, 06:52:42 pm by Reelya »
Logged

Reelya

  • Bay Watcher
    • View Profile
Re: Tech News. Automation, Engineering, Environment Etc
« Reply #738 on: May 10, 2017, 08:37:37 pm »

If I'm reading this right, the measure of risk they used is whether a suspect would later be arrested again, for the same or a different crime?


If so, I am immediately terrified of its use in the US. Think about how many minor drug arrests there are. As far as I can tell, the only "risk" it assesses is the risk of the suspect, after being released, committing a crime. Considering how many people in the US are arrested for minor drug offenses, you can imagine how this might skew the thing into saying that every 20-something male in the US is a "high risk" individual.

That's really not going to happen. Only a very small percentage of people, even 20-something males, will be arrested in a given year. If an AI were rating your likelihood of arrest based purely on age and gender, it would only give perhaps a 1% likelihood of arrest on that basis alone.

And if other factors are accounted for, that actually reduces any assessed likelihood arising from broad characteristics such as race, age, or gender: once you've factored those things out of the background noise, the weight of e.g. gender in the analysis is in fact reduced.

E.g. males might be more likely to be carrying a handgun, and the AI might determine that people who are caught carrying a handgun have a higher likelihood of committing later offenses. But because that factor has now been accounted for, the "base" factor related to just being male is reduced: males who didn't have handguns are less likely to commit offenses than an "average male", so the base-level risk assessed for just "being male" drops to the new baseline of "males without handguns".

By a process of trying out different data to see which is the better predictor, you reduce the reliance on correlations with broad-brush predictors (since those correlate with the other predictors in a non-random fashion, their impact on the final assessment is reduced).
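To put toy numbers on the handgun example (every figure below is invented purely for illustration): conditioning on the handgun variable lowers the residual baseline attributed to just "being male".

```python
# Toy contingency table for the handgun example -- all numbers invented.
males_gun, males_nogun = 100, 900      # 1000 males, 100 carrying a handgun
offend_gun, offend_nogun = 40, 90      # later offenses in each subgroup

# Risk attributed to "being male" before accounting for handguns:
p_male = (offend_gun + offend_nogun) / (males_gun + males_nogun)  # 130/1000

# After the handgun factor is accounted for, the residual "male" baseline
# is the no-handgun rate, which sits below the unconditional male rate:
p_male_nogun = offend_nogun / males_nogun  # 90/900
p_gun = offend_gun / males_gun             # 40/100

assert p_male_nogun < p_male < p_gun
```

With these made-up numbers the unconditional male rate is 13%, but once handgun-carrying absorbs its share, the plain "male" baseline falls to 10%.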
« Last Edit: May 10, 2017, 08:52:02 pm by Reelya »
Logged

MrRoboto75

  • Bay Watcher
  • Belongs in the Trash!
    • View Profile
Re: Tech News. Automation, Engineering, Environment Etc
« Reply #739 on: May 10, 2017, 08:51:31 pm »

There's an anime about that somewhere
Logged
I consume
I purchase
I consume again

Reelya

  • Bay Watcher
    • View Profile
Re: Tech News. Automation, Engineering, Environment Etc
« Reply #740 on: May 10, 2017, 08:55:08 pm »

Since explicit data would need to be entered into the system, it would be fairly easy to keep racial profiling out of it: just don't allow the system access to race data.

The AI would then need to find proxy measures that correlate with future arrests. But those proxy measures are in fact going to reduce the amount that can be purely associated with race. E.g. a black college student is less likely to be arrested in a given year than a black gang member, so by factoring in educational level you reduce the amount that can be associated with race, even if the cops have been racially profiling people. That's because racial profiling isn't an independent variable: you're much more likely to be picked up if your circumstances are in fact different. And when those arrests are accounted for by other variables, they reduce the amount that can be directly associated with race, even if you allow race to be entered as a variable.
« Last Edit: May 10, 2017, 09:09:43 pm by Reelya »
Logged

Reelya

  • Bay Watcher
    • View Profile
Re: Tech News. Automation, Engineering, Environment Etc
« Reply #741 on: May 10, 2017, 09:24:23 pm »

Well, if you take the current data, including racial profiling biases then there would be some component purely correlated with race, and other components correlated with secondary characteristics that are themselves correlated with race.

If you then eliminate race from your decision process, but allow the secondary characteristics, that's going to change your future data set such that the amount purely associated with race is going to tend towards zero. So yes even with a biased sample base, the process will reduce the amount of racial profiling. There will still be biases in there, but they're not going to be very strong when you eliminate the main one that caused the initial bias. e.g. hypothetically "has dreadlocks" could be a good predictor of being arrested, and that might remain as a correlation if you disallow race data from being entered, but other concrete factors are going to in fact chip away at how well "has dreadlocks" explains variable arrest rates.

alway

  • Bay Watcher
  • 🏳️‍⚧️
    • View Profile
Re: Tech News. Automation, Engineering, Environment Etc
« Reply #742 on: May 11, 2017, 02:37:43 am »

Such systems have already been used and are already resulting in perpetuation of racism in an automated manner. Moreover, these systems:
1. Are proprietary: defendants don't get to question how the system reaches its conclusions, because the inner workings are trade secrets.
2. Are opaque: see the above. Without extensive access to the tools and a degree in statistics, no one will ever find out about its biases except through years of abuse after the fact.
3. Only give back results similar to what you were getting before, *at best.* Some minorities are heavily over-represented in US prisons, and the trained goal, if everything works exactly as planned, is to match those results. Even assuming the company making it does its absolute best and trains the system perfectly, the best you can expect from automating a biased policing system is an automated biased policing system.

This all assumes a system that works as designed and *merely recreates the existing systemic biases*. But with the secrecy from 1 and 2, a third party can't even determine if the system is just as biased as the old way of doing things, let alone worse.
Quote
Well, if you take the current data, including racial profiling biases then there would be some component purely correlated with race, and other components correlated with secondary characteristics that are themselves correlated with race.

If you then eliminate race from your decision process, but allow the secondary characteristics, that's going to change your future data set such that the amount purely associated with race is going to tend towards zero. So yes even with a biased sample base, the process will reduce the amount of racial profiling. There will still be biases in there, but they're not going to be very strong when you eliminate the main one that caused the initial bias. e.g. hypothetically "has dreadlocks" could be a good predictor of being arrested, and that might remain as a correlation if you disallow race data from being entered, but other concrete factors are going to in fact chip away at how well "has dreadlocks" explains variable arrest rates.
As for this bit, it's just not true. The patterns you're training on have bias baked into them -- any well-trained system will pick up on them, and represent that in whatever way it most effectively can so long as it can find any proxy for that information. If there is literally any way to represent a factor that results in a 10x likelihood of arrest (as being black in the US does https://www.usatoday.com/story/news/nation/2014/11/18/ferguson-black-arrest-rates/19043207/ ), it will find those links and represent them through any means necessary. And herein lies the problem -- there are tons of very subtle and interrelated systems at work that determine whether a person released from prison will be re-arrested, most of which are outside the realm of anything being captured by the system. Reduce this to a tractable statistical problem, and the end result will be a load of uninterpreted noise plus a strong signal of easily captured bias. It won't be a recidivism predictor, it will be a black male predictor, used to lock people up with no more explanation than "the computer said so, and computers can't be biased." Or as some machine learning folks on twitter put it: "Bias laundering"
Logged

Starver

  • Bay Watcher
    • View Profile
Re: Tech News. Automation, Engineering, Environment Etc
« Reply #744 on: May 12, 2017, 05:48:35 pm »

Cue "Update your computer now! Microsoft has detected that you need to apply this patch! Click here <points to mircosoft.co domain> now and follow all the installation instructions!" emails.
Logged

misko27

  • Bay Watcher
  • Lawful Neutral; Prophet of Pestilence
    • View Profile
Re: Tech News. Automation, Engineering, Environment Etc
« Reply #745 on: May 12, 2017, 07:49:08 pm »

Odds that this is just a really elaborate setup by Microsoft to make people update.

Jokes aside, this looks pretty devastating. The damage I'm hearing about in Britain sounds pretty horrible, but apparently it's everywhere.
Logged
The Age of Man is over. It is the Fire's turn now

Sergarr

  • Bay Watcher
  • (9) airheaded baka (9)
    • View Profile
Re: Tech News. Automation, Engineering, Environment Etc
« Reply #746 on: May 12, 2017, 08:02:20 pm »

Welp, hopefully disabling SMBv1 will be enough to avoid getting fucked.
Logged
._.

martinuzz

  • Bay Watcher
  • High dwarf
    • View Profile
Re: Tech News. Automation, Engineering, Environment Etc
« Reply #747 on: May 12, 2017, 08:24:37 pm »

According to my newspaper, it's not just hospitals that are hit. A lot of banks in Spain are locked out of accessing their funds and transfer system.
This attack is no joke.

Inb4 billion-trillion-dollar lawsuits hit the NSA for letting this get out in the open.

EDIT: still, them hitting hospital systems is a million times worse than them disabling banks, even though the economic damage from the latter could well be several times higher. Disabling hospitals is just plainly a crime against humanity. I hope they catch whoever did it and put them in jail for life.
« Last Edit: May 12, 2017, 08:27:41 pm by martinuzz »
Logged
Friendly and polite reminder for optimists: Hope is a finite resource

We can ­disagree and still love each other, ­unless your disagreement is rooted in my oppression and denial of my humanity and right to exist - James Baldwin

http://www.bay12forums.com/smf/index.php?topic=73719.msg1830479#msg1830479

Reelya

  • Bay Watcher
    • View Profile
Re: Tech News. Automation, Engineering, Environment Etc
« Reply #748 on: May 12, 2017, 08:24:47 pm »

Quote
It won't be a recidivism predictor, it will be a black male predictor, used to lock people up with no more explanation than "the computer said so, and computers can't be biased." Or as some machine learning folks on twitter put it: "Bias laundering"

I was actually hoping that it would be trained on data from other nations. If you train on data from many nations, then nation-specific biases will tend to be evened out.

However, even given proxy biases, one AI trained on US data was twice as likely to flag black vs white criminals, whereas the difference in arrest rates when human officers make the assessment is 3.6 black to 1 white. So I stand by what I said before: giving the AI proxy information is going to damp down the human bias quite a bit, and if you run the AI for a generation it will have an error term based on its own assessments vs reality, which will damp down the error due to bias even further.

Like self-driving cars, it doesn't need to be perfect to be an improvement, just better than we are, with error correction headed in the right direction.

wierd

  • Bay Watcher
  • I like to eat small children.
    • View Profile
Re: Tech News. Automation, Engineering, Environment Etc
« Reply #749 on: May 12, 2017, 08:56:55 pm »

Concerning the "Shadow Brokers" release of offensive state malware:

This (what is happening RIGHT NOW) is EXACTLY the reason why "sitting" on exploits, instead of properly and discreetly disclosing them to vendors with the threat of a serious state beat-down if they don't fix them pronto, IS A FUCKING STUPID AS SHIT IDEA.

As proven by demonstration, the NSA and pals are *NOT* invulnerable to unsanctioned release or compromise of their archival systems. Hoarding this shit where people can get at it (like the aforementioned Shadow Brokers did), instead of disclosing it discreetly to vendors and assuring proper public safety (LIKE THEIR MANDATE SAYS), is how you end up hurting millions of people and causing billions to trillions of dollars in damages.

But no -- cue the people who will clamor about how "essential" and "necessary" it is to hoard dangerous exploits like this for cloak-and-dagger statecraft bullshit.
Logged