
rl minority report

#1 Aug 15 2015 at 10:12 AM Rating: Decent
Scholar
***
1,323 posts
Thanks to Snowden we now know that the no-fly list is essentially made up on the fly (ah-ha).

I have to admit, I did not expect that. I expected a better attempt to justify placing people on a list so secretive that getting off it is virtually impossible. Thanks Snowden :>
____________________________
Your soul was made of fists.

Jar the Sam
#2 Aug 17 2015 at 4:09 PM Rating: Decent
Encyclopedia
******
35,568 posts
This bit is basically moronic disconnected BS:

Quote:
The Obama administration is trying to prevent further disclosures about the program's basis for denying Americans the right to travel based on secret evidence and an opaque process. FBI counter-terrorism assistant director Michael Steinbach defended the no-fly list's dependence on security through obscurity: "If the Government were required to provide full notice of its reasons for placing an individual on the No Fly List and to turn over all evidence (both incriminating and exculpatory) supporting the No Fly determination, the No Fly redress process would place highly sensitive national security information directly in the hands of terrorist organizations and other adversaries."

Basically: "Our process only works if we get to keep it secret. If our adversary understood it, it would be easy to defeat."

It's a unique view of knowledge-construction that denies the traditions of the scientific method stretching back to the Enlightenment, which holds that the only way to root out flaws in your assumptions and methodologies is adversarial peer review. It's a view that says that aviation security is unique among all technical disciplines, capable of reaching valid considerations through a process of friendly, internal review by parties who believe in its mission and all draw their paychecks from the same agency.


I'm by no means defending the program itself, but the basic concept of concealing the data points you use as red flags in your system from those who might want to avoid being red flagged is hardly unscientific. Kinda has nothing to do with science at all, hence the disconnect. Attack it on civil liberty grounds, or privacy grounds, or any other grounds. Trying to argue that it somehow flies in the face of accepted methodologies is just ridiculous.

Fair or not, there are far more operatives than the US government tracking our behavior and assessing our likely future actions and predilections. The science of looking at people's social metadata and making assessments about them is actually pretty darn robust. If it wasn't, private companies wouldn't see the cost benefit of using it to target advertisements at people they think may be more interested in buying product A instead of product B. Such data analysis is a multi-billion-dollar industry. To suggest that it doesn't work is kinda silly. Does it work as well for predicting who might be inclined towards terrorism? No clue. But, as I mentioned above, the primary issue isn't that they're using these sorts of techniques in the first place, but whether the actions they take as a result (say, putting someone on a no-fly list) should be allowable. Declaring the method to be unscientific is the wrong approach IMO.
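
Just to illustrate the kind of thing I mean (a toy sketch I made up on the spot, with completely hypothetical feature names and weights, not how any real ad platform works): take whatever metadata you have on a person, weight it, and score which product they're more likely to care about.

Code:
# Toy illustration only: score a user's interest in "product A" vs "product B"
# from a handful of made-up metadata fields. Real targeting models are far more
# sophisticated, but the basic idea is just weighted evidence.

WEIGHTS_A = {                      # hypothetical features; positive leans A, negative leans B
    "follows_outdoor_pages": 2.0,
    "searched_camping_gear": 3.0,
    "lives_in_city_center": -1.5,
    "age_under_30": 0.5,
}

def score_product_a(user_features):
    """Sum the weights of whichever features this user exhibits."""
    return sum(w for feat, w in WEIGHTS_A.items() if user_features.get(feat))

def pick_ad(user_features, threshold=1.0):
    """Show product A's ad if the evidence leans that way, otherwise product B's."""
    return "product A" if score_product_a(user_features) >= threshold else "product B"

# A user who searched for camping gear but lives downtown still leans toward A.
print(pick_ad({"searched_camping_gear": True, "lives_in_city_center": True}))  # product A

Swap the made-up weights for ones fit to actual purchase data and that's the gist of why the industry is worth billions.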
____________________________
King Nobby wrote:
More words please
#3 Aug 17 2015 at 5:00 PM Rating: Decent
Scholar
***
1,323 posts
gbaji wrote:
This bit is basically moronic disconnected BS:

Quote:
The Obama administration is trying to prevent further disclosures about the program's basis for denying Americans the right to travel based on secret evidence and an opaque process. FBI counter-terrorism assistant director Michael Steinbach defended the no-fly list's dependence on security through obscurity: "If the Government were required to provide full notice of its reasons for placing an individual on the No Fly List and to turn over all evidence (both incriminating and exculpatory) supporting the No Fly determination, the No Fly redress process would place highly sensitive national security information directly in the hands of terrorist organizations and other adversaries."

Basically: "Our process only works if we get to keep it secret. If our adversary understood it, it would be easy to defeat."

It's a unique view of knowledge-construction that denies the traditions of the scientific method stretching back to the Enlightenment, which holds that the only way to root out flaws in your assumptions and methodologies is adversarial peer review. It's a view that says that aviation security is unique among all technical disciplines, capable of reaching valid considerations through a process of friendly, internal review by parties who believe in its mission and all draw their paychecks from the same agency.


I'm by no means defending the program itself, but the basic concept of concealing the data points you use as red flags in your system from those who might want to avoid being red flagged is hardly unscientific. Kinda has nothing to do with science at all, hence the disconnect. Attack it on civil liberty grounds, or privacy grounds, or any other grounds. Trying to argue that it somehow flies in the face of accepted methodologies is just ridiculous.

Fair or not, there are far more operatives than the US government tracking our behavior and assessing our likely future actions and predilections. The science of looking at people's social metadata and making assessments about them is actually pretty darn robust. If it wasn't, private companies wouldn't see the cost benefit of using it to target advertisements at people they think may be more interested in buying product A instead of product B. Such data analysis is a multi-billion-dollar industry. To suggest that it doesn't work is kinda silly. Does it work as well for predicting who might be inclined towards terrorism? No clue. But, as I mentioned above, the primary issue isn't that they're using these sorts of techniques in the first place, but whether the actions they take as a result (say, putting someone on a no-fly list) should be allowable. Declaring the method to be unscientific is the wrong approach IMO.


Hmm, I obviously disagree. Few would disagree that Israel has pretty good security at its airports. How they do it is not exactly a secret, and security through obscurity has been repeatedly shaken to its foundations over the past few years (I have stopped counting the smaller breaches). It is not that it is unscientific. It is that it does not work well (and then there is the question of liberty, privacy, and all that fun stuff).

The key phrase in your case may be "accepted methodologies". Just because they are accepted does not mean they are right. Not too long ago it was accepted methodology to let out blood to cure people, to classify homosexuality as a mental disorder, and to add cocaine to Coca-Cola, just to name a few.

As for the data analysis... yeah, it is a big industry, and, frankly, it seems like companies are collecting more data than they know what to do with. It does not help that some of the management does not understand what they can do with said data. Worse yet, they seem not to understand the difference between data and information (some of the more amusing examples being metrics that include lines of code). That is how you get this weird push to try to record every single ******* thing to ensure that not one person ever sits idly by (Amazon being an interesting example). I see the big data bubble like all the other ones, and this one is slowly getting ready to burst.

Now, to address the method as unscientific. Would you say that classifying you as a terrorist on a glorified hunch is scientific?


____________________________
Your soul was made of fists.

Jar the Sam
#4 Aug 17 2015 at 6:12 PM Rating: Decent
Encyclopedia
******
35,568 posts
angrymnk wrote:
Hmm, I obviously disagree. Few would disagree that Israel has pretty good security at its airports. How they do it is not exactly a secret, and security through obscurity has been repeatedly shaken to its foundations over the past few years (I have stopped counting the smaller breaches). It is not that it is unscientific. It is that it does not work well (and then there is the question of liberty, privacy, and all that fun stuff).


The last sentence is all that really matters. My issue is with declaring the process to be unscientific, which was basically the entire thrust of the article and is, IMO, the wrong approach if you want to critique said process.

Another issue is your completely nonsensical use of the phrase "security through obscurity". That phrase refers to a security methodology which relies on people simply not knowing where you hid your stuff. It doesn't at all refer to a security methodology that relies on people not knowing what your methodology is. Trying to apply it to the latter case would invalidate all "secret knowledge" based security as insecure. So your password isn't secure because it merely requires people to not know what it is. Um... Silly.

Security through obscurity would be not having any password at all and simply relying on the fact that the internet is big, so the odds of someone finding and stealing your stuff are small (or just hoping terrorists can't find potentially damaging targets to hit out of all the less important ones out there). Which *is* a terrible security model. Again though, it's not remotely applicable to not wanting people to know what pattern of behaviors you look for to red flag people as potential terrorists. In the same way, not wanting to tell people your password is a perfectly reasonable action to take, one which you might expect would increase your odds of securing whatever you're doing.
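
To make the password analogy concrete, here's a rough sketch (the check_login function and everything around it is mine, not anybody's actual system): the hashing scheme is completely public and could be printed in the newspaper, yet the login stays secure, because the only thing that has to stay secret is the password itself. That's the distinction I'm drawing.

Code:
import hashlib
import hmac
import os

# The method (PBKDF2 with SHA-256, 200,000 iterations) is no secret at all.
# Security rests entirely on the secret password, not on hiding the method.

def hash_password(password: str, salt: bytes = None):
    """Derive a stored hash from a password; the random salt is stored beside it."""
    salt = salt or os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return salt, digest

def check_login(attempt: str, salt: bytes, stored_digest: bytes) -> bool:
    """Recompute the hash for the attempt and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", attempt.encode(), salt, 200_000)
    return hmac.compare_digest(candidate, stored_digest)

salt, stored = hash_password("correct horse battery staple")
print(check_login("correct horse battery staple", salt, stored))  # True
print(check_login("letmein", salt, stored))                       # False

Security through obscurity, by contrast, would be skipping the password entirely and just hoping nobody ever stumbles across your account page.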

Quote:
The key phrase in your case may be "accepted methodologies". Just because they are accepted does not mean they are right. Not too long ago it was accepted methodology to let out blood to cure people, to classify homosexuality as a mental disorder, and to add cocaine to Coca-Cola, just to name a few.


Then take issue with the article, not me. The article attempted to argue that the method used was somehow not the right one to use, not because of moral/ethical concerns, or even because it might not work as well as others, but simply by declaring it to "fly in the face" of accepted scientific method.

I'll also point out that the article writer makes assumptions about what constitutes scientific method that aren't technically correct. While peer review can be an important part of the process, it's not what makes the method "scientific". Um... And the whole thing is just a disconnect anyway. What's with trying to paint something as unscientific? It just doesn't fit here.

Quote:
Now, to address the method as unscientific. Would you say that classifying you as a terrorist on a glorified hunch is scientific?


Setting aside that "no-fly" doesn't equal "terrorist", sure. Why not? Most scientific advances (I'd argue all of them) started with a "glorified hunch". Again, my issue is even trying to define this in a "scientific vs non-scientific" manner. I just see that as a real disconnect. Remember that scientific method is a method. It's a process by which you try things, see what works, and then make adjustments and try again. Someone coming into the middle of that process and declaring that since the results today aren't 100% perfect that whatever is being done is "non scientific" is pretty silly. You could more or less erase every single technological advancement in the history of man if you tried to apply that standard.

And no, an open peer review process has nothing to do with it. A single person, working alone in his basement, continually refining his work over time despite acting in complete secrecy, can be using, and almost certainly *is* using, the scientific method. As long as he's testing his results and attempting to make changes designed to improve the next result, he's using the scientific method. Success or failure doesn't make it science. Doing it in the dark versus the light of day doesn't make it science. The author basically seems to want to apply an openness standard to all science, but that's just not what science is.

Hence my issue with that line of reasoning. Say you don't like it because it's profiling people and that's bad. Or because of the privacy issues. Or that it's unethical to restrict flight based on said profiling. Heck. Say you don't like the lack of transparency because you think it increases the likelihood of corruption, abuse, and cover ups of said things. Those are all good arguments to make. Saying that it's unscientific? Really really weak IMO.

Edited, Aug 17th 2015 5:16pm by gbaji
____________________________
King Nobby wrote:
More words please
#5 Aug 17 2015 at 6:54 PM Rating: Decent
Scholar
***
1,323 posts
gbaji wrote:


Hence my issue with that line of reasoning. Say you don't like it because it's profiling people and that's bad. Or because of the privacy issues. Or that it's unethical to restrict flight based on said profiling. Heck. Say you don't like the lack of transparency because you think it increases the likelihood of corruption, abuse, and cover ups of said things. Those are all good arguments to make. Saying that it's unscientific? Really really weak IMO.

Edited, Aug 17th 2015 5:16pm by gbaji


Sigh, the really sad thing is, though, that whenever I mention the p-word, I do not even get a shrug anymore. It is almost as bad as cops who refer to people who actually know their rights as 'crazy constitutionalists'. If anyone cares to respond at that point, it is a response along the lines of 'I'm clean so I don't care' and 'I'm too unimportant a target'.

At this point, and I am sad to admit it, I am perfectly willing to try a different approach and see if it sticks.
____________________________
Your soul was made of fists.

Jar the Sam
#6 Aug 17 2015 at 7:54 PM Rating: Decent
Encyclopedia
******
35,568 posts
angrymnk wrote:
gbaji wrote:


Hence my issue with that line of reasoning. Say you don't like it because it's profiling people and that's bad. Or because of the privacy issues. Or that it's unethical to restrict flight based on said profiling. Heck. Say you don't like the lack of transparency because you think it increases the likelihood of corruption, abuse, and cover ups of said things. Those are all good arguments to make. Saying that it's unscientific? Really really weak IMO.


Sigh, the really sad thing is, though, that whenever I mention the p-word, I do not even get a shrug anymore. It is almost as bad as cops who refer to people who actually know their rights as 'crazy constitutionalists'. If anyone cares to respond at that point, it is a response along the lines of 'I'm clean so I don't care' and 'I'm too unimportant a target'.


Profiling? Or privacy?

I guess I may be in the minority here, because I believe strongly in privacy rights, but I don't have as much of an issue with profiling based on information you do give out. I suppose where I have the most issue with it though is the creeping process of introducing yet more things we must do that require that we put more information out there about us, which in turn makes profiling more prevalent. I think that we need to have a firm line in terms of what is private/personal information that no one should require us to disclose (or if it is disclosed, the second party promises to keep confidential). What I find discouraging is what appears to be a lot of misunderstanding/disagreement over what is and isn't "private". Which leads to many people fighting against/for the wrong things IMO.


As to the effectiveness of profiling itself, I think a strong case can be made that it is quite possible to create broad predictions about behavior across a group of people based on past actions. I think the concept gets a bad rep because it's normally assumed to be a shortened term for "racial profiling". But profiling itself is neither bad nor good. It simply is. And frankly, it's something we all do every day without thinking about it. Unless you are utterly oblivious to what others around you are doing, and make no effort to interpret those actions, you are profiling. You look at someone standing at a crosswalk staring at the light signal and conclude that person is probably waiting for the light to turn so they can cross. You just profiled that person. Profiling is simply looking at data you know about someone and predicting future actions that person may take based on that information. There are a host of actions that tend to operate in connection with each other, and it's not hard to see the patterns that result. You just have to pay attention.


Again, my issue isn't with the profiling itself, but with the methods of obtaining the data used, and the actions being taken as a result. If the government wants to rummage through publicly available information looking for red flags and then take a closer look at those whose patterns indicate something potentially nefarious, that's perfectly fine. And if they find something sufficiently incriminating or suspicious, then they can take the next step into actual (warranted) surveillance. But to broadly restrict people from engaging in a normal bit of commerce/travel merely for matching a pattern? I have an issue with that. And IMO that's the part of this that we need to focus on. It's the bits on the ends that matter. Do we infringe people's privacy in gaining the data? And do we infringe other rights with that data absent strong evidence of actual criminal intent? What speculations we draw from otherwise open/accessible data shouldn't be the issue. What actions our government is allowed to take based on that data *is*.
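
For what it's worth, the tiering I'm describing is simple enough to sketch out (every field name, weight, and threshold below is invented by me for illustration and implies nothing about any real watch-list): automated scoring of open data can only flag someone for a closer look by a human, and anything beyond that, actual surveillance or travel restrictions, deliberately isn't a function in the sketch because it should require specific evidence and a warrant, not a score.

Code:
# Hypothetical sketch of the tiers described above; nothing here reflects any
# real system. Scoring open data can only queue a person for human review.

RED_FLAG_WEIGHTS = {               # invented red flags and weights
    "bought_restricted_chemicals": 5,
    "posted_violent_threats": 8,
    "visited_extremist_forums": 3,
}

CLOSER_LOOK_THRESHOLD = 8          # invented cutoff for analyst review

def red_flag_score(public_record: dict) -> int:
    """Score a person's publicly available data against the red-flag list."""
    return sum(w for flag, w in RED_FLAG_WEIGHTS.items() if public_record.get(flag))

def triage(public_record: dict) -> str:
    """Decide only whether a human should take a closer look -- nothing more."""
    if red_flag_score(public_record) >= CLOSER_LOOK_THRESHOLD:
        return "queue for analyst review"
    return "no action"

print(triage({"posted_violent_threats": True}))    # queue for analyst review
print(triage({"visited_extremist_forums": True}))  # no action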

Edited, Aug 17th 2015 7:11pm by gbaji
____________________________
King Nobby wrote:
More words please
#7 Aug 18 2015 at 12:32 AM Rating: Good
***
1,159 posts
For a guy that rambles on about states' rights you sure do love to jump on the Federal Government's ****.
____________________________
Timelordwho wrote:
I'm not quite sure that scheming is an emotion.
#8 Aug 18 2015 at 2:14 PM Rating: Decent
Prodigal Son
******
20,643 posts
Kavekkk wrote:
For a guy that rambles on about states' rights you sure do love to jump on the Federal Government's ****.

He loves that small government ****.
____________________________
publiusvarus wrote:
we all know liberals are well adjusted american citizens who only want what's best for society. While conservatives are evil money grubbing scum who only want to sh*t on the little man and rob the world of its resources.
#9 Aug 19 2015 at 8:24 PM Rating: Decent
Encyclopedia
******
35,568 posts
Kavekkk wrote:
For a guy that rambles on about states' rights you sure do love to jump on the Federal Government's ****.


My issue is the obsession with the government actually looking at information they have to make assessments, when our focus should be on A) how the government obtains the information, and B) what actions the government takes. The act of profiling isn't itself a problem. Heck. It's logical and reasonable. It's kind of stupid to have a set of data and choose to not look at it. Assuming we'd prefer a government that makes intelligent and informed decisions, this just makes zero sense.

I just think that people look at the wrong part of the issue is all. Has nothing to do with big or small government. Assuming we all agree that law enforcement is one of the roles of government, then the issue isn't with it doing so, but how it does so. Right?
____________________________
King Nobby wrote:
More words please
#10 Aug 26 2015 at 1:52 PM Rating: Decent
Lunatic
******
30,086 posts
gbaji wrote:
I'm by no means defending the program itself, but the basic concept of concealing the data points you use as red flags in your system from those who might want to avoid being red flagged is hardly unscientific. Kinda has nothing to do with science at all, hence the disconnect. Attack it on civil liberty grounds, or privacy grounds, or any other grounds. Trying to argue that it somehow flies in the face of accepted methodologies is just ridiculous.

Well, there's the surface patina of 'if you let them know what not to do, they'll avoid being on the list' thought, but then, a millimeter deeper is the 'they won't be doing things you don't want them to do' part, where preventing actual harm is probably more useful than 'it's a seeeekrit'. This is why most police cars aren't unmarked.
____________________________
Disclaimer:

To make a long story short, I don't take any responsibility for anything I post here. It's not news, it's not truth, it's not serious. It's parody. It's satire. It's bitter. It's angsty. Your mother's a *****. You like to jack off dogs. That's right, you heard me. You like to grab that dog by the bone and rub it like a ski pole. Your dad? Gay. Your priest? Straight. **** off and let me post. It's not true, it's all in good fun. Now go away.

#11 Aug 26 2015 at 2:20 PM Rating: Excellent
Liberal Conspiracy
*******
TILT
Smasharoo wrote:
This is why most police cars aren't unmarked.

That YOU know of...
____________________________
Belkira wrote:
Wow. Regular ol' Joph fan club in here.
#12 Aug 26 2015 at 4:16 PM Rating: Decent
Encyclopedia
******
35,568 posts
Jophiel wrote:
Smasharoo wrote:
This is why most police cars aren't unmarked.

That YOU know of...


Well, and police who are engaged in surveillance generally aren't sitting in marked cars either.

Smash does have a point, but it's one thing to say that sometimes our focus should be on telling people what they should not be doing via open enforcement, and yet another to determine whether this is actually one of those times. Frankly, I don't think we can equate someone considering speeding, who might be deterred upon sight of a patrol car, with someone considering blowing up a building. I think we can all agree that the presence of a cop may make people think twice about committing a crime right then, but it generally does not prevent them from committing that crime later when the cop isn't around. So that methodology is really only useful when the overt authority is present. One then has to measure the relative potential infringement involved with subtle surveillance designed to detect those planning certain crimes against the potential infringement of requiring security personnel to be present *everywhere* in order to protect us from those crimes.

This decision is not made in a vacuum. It's not enough just to say that data examination and profiling represent potential infringement; you have to compare them against the alternative approaches. And in this case, I suspect that "look for red flags in data sets" is less problematic than "place armed guards everywhere a terrorist might attack". That doesn't preclude us also placing armed guards at high-value targets, but what about malls, cafes, grocery stores, gas stations, and frankly every target that someone might decide to go after where armed guards aren't? Even if it were practical to directly protect every place all the time, would we want to live in a society like that?

Edited, Aug 26th 2015 3:17pm by gbaji
____________________________
King Nobby wrote:
More words please
#13 Sep 01 2015 at 5:17 PM Rating: Good
GBATE!! Never saw it coming
Avatar
****
9,957 posts
gbaji wrote:
If the government wants to rummage through publicly available information looking for red flags and then take a closer look at those whose patterns indicate something potentially nefarious, that's perfectly fine. And if they find something sufficiently incriminating or suspicious, then they can take the next step into actual (warranted) surveillance.


subject: gbaji

loner
tech geek
gobs of money
frequent trips out of USA
frequent trips to countries with Muslim majority population.


These are sufficiently incriminating or suspicious as far as I'm concerned. Likely the NSA, too. I'm glad you are open to being strip-searched on every flight you ever take again.
____________________________
remorajunbao wrote:
One day I'm going to fly to Canada and open the curtains in your office.
