October 15th, 2003, 08:24 AM
Are there any organisations that have a lab setup before deployment? I mean, a day or two (maybe even a week) of testing application behaviour after patches have been deployed from the SUS server, with a staged, progressive roll-out across the enterprise? Makes sense if organisational policies and procedures are in place..... It's always a good idea to back up and archive before patches are deployed. If one fails to plan, one plans to fail?
HO$H Pagamisa. Pro Amour Ludi....
October 15th, 2003, 08:46 AM
I have to say that this auto update is already in there; you just have to switch it on. That's what I seem to notice in Windows 2000: it's off by default, and if you switch it on it will inform you once updates are available. For home users and non-critical systems I guess it's a good thing, but for critical systems where downtime is not an option it's not a good idea, as you will need to patch them in the lab first and test to see if anything is broken by the patch.
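For anyone looking for the switch: on Windows 2000 (SP3 and later) the Automatic Updates client can also be configured through registry policy values, which is how SUS points clients at an internal server. A rough sketch as a .reg file - the server name is a placeholder, and the values should be checked against Microsoft's SUS client documentation before rolling anything out:

```
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate]
; Placeholder name - point clients at your internal SUS server
"WUServer"="http://susserver"
"WUStatusServer"="http://susserver"

[HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate\AU]
; 0 = Automatic Updates enabled
"NoAutoUpdate"=dword:00000000
; 4 = auto download and schedule the install (2 = notify, 3 = download then notify)
"AUOptions"=dword:00000004
; 0 = every day; install at 03:00
"ScheduledInstallDay"=dword:00000000
"ScheduledInstallTime"=dword:00000003
```

The same values can of course be pushed through Group Policy rather than raw registry edits.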
October 15th, 2003, 01:44 PM
S1lv3rW3bSurf3r: we do a dev-to-stage-to-production setup with a test lab for all patches first. In my office this is vital; as I said before, we have a lot of custom in-house COM objects, and MS patches tend to break at least one of them.
Who is more trustworthy than all of the gurus or Buddhas?
October 15th, 2003, 01:48 PM
I would never use the auto update feature. You never know exactly what it is updating, and as for my experience with MS's rollback feature... let's just say "format c:\".
October 15th, 2003, 02:40 PM
I am working towards getting the auto update approach agreed, but from a SUS server, so we have an opportunity to test and validate patches with our software before rolling them out. With the desktops, this should work reasonably successfully, given our normal configuration.
What I will need to work harder on is the server updates. I think we could get these to work too, particularly as we don't have any 24/7 critical systems ( xmaddness - you are absolutely right, auto update on your systems would be a seriously bad idea ). I have to question which is worse - an hour's downtime of one system, or another network infection. Given that the last time we were hit, one of our offices was completely out for an afternoon, I think we can handle the occasional problem.
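The trade-off above can be put into rough numbers. This is only a sketch: every probability and duration below is a made-up illustrative assumption, purely to show the expected-cost reasoning, not data from any real environment.

```python
# Rough expected-downtime comparison: risk from a bad auto-installed patch
# versus risk from staying unpatched and catching a worm.
# All figures are illustrative assumptions, not measurements.

def expected_hours_lost(prob_per_month, hours_per_incident, machines_hit):
    """Expected machine-hours lost per month for one failure mode."""
    return prob_per_month * hours_per_incident * machines_hit

# Assumption: a bad patch takes down one server for about an hour,
# maybe once a year.
patch_risk = expected_hours_lost(prob_per_month=1 / 12,
                                 hours_per_incident=1,
                                 machines_hit=1)

# Assumption: an outbreak on unpatched machines costs an office of,
# say, 50 machines half a day, perhaps twice a year.
infection_risk = expected_hours_lost(prob_per_month=2 / 12,
                                     hours_per_incident=4,
                                     machines_hit=50)

print(f"bad patch:  {patch_risk:.2f} machine-hours/month")
print(f"infection:  {infection_risk:.2f} machine-hours/month")
```

With anything like these numbers, the occasional patch casualty is cheap insurance; the point of the sketch is only that the comparison should be made per environment, with your own figures.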
I know this wouldn't work for everyone. It is an assessment of the most likely dangers to our systems that has led me to this approach. I have also not had any problems with any MS patches on my desktop machine ( except a certain Office SP, which had to go ).
October 15th, 2003, 02:54 PM
Auto update alternative for large corporations!
I'm surprised that I have not seen this alternative mentioned. It is possible to set up your own Windows update server. When you do this you have the flexibility to select which patches you want to install, and you can also have SMS push the updates to the computers during off-peak hours, when users will not be interrupted by it.
From what I've heard the server is relatively easy to set up, and this will fix a lot of vulnerabilities on a massive number of computers, quickly. I don't really know of many other ways you can say you've patched 1800 computers in a day!
Just a suggestion!
Have a great day!
October 15th, 2003, 02:59 PM
I assume you are talking about SUS? It is a decent solution, and free, but not foolproof; it doesn't always correctly detect which patches need to be installed (the first RPC patch comes to mind). We use a product from Altiris that seems to work better.
Who is more trustworthy than all of the gurus or Buddhas?
October 15th, 2003, 03:01 PM
This is what I do now:
Disclaimer: I don't consider my web sites, email etc. that are available from the public network to be mission critical...... We do no e-commerce and the hit rate for the sites will hardly have the phones ringing off the hook because people can't donate their $10......
So - if the machine faces the internet, it gets auto-updated at 3:00 am daily, regardless of the potential downing of the system. 3:00 am is the time by which the system state and daily backups have completed. In almost 2 years of doing this none of the machines have dropped - but then again I'm running pretty "plain jane" machines.
My reasoning: It's a whole lot easier to regenerate a server from scratch or from a backup because a patch blocking a hole messed it up than it is to carry out a forensic investigation to determine which of my other 650 machines got "jumped off" onto..... and if I miss something in that investigation.... I'm screwed and I don't even know it......
I would suggest that if your internet-facing machines are not mission critical - i.e. they won't cost the company mucho dinero, and you your job, if they are down for more than a blink of an eye - then you should probably use auto-update right after a daily backup sequence too.
But that's just the way I do things.......
I noticed a few of you mentioning how long the updates can take. Reading the article, I got the impression that "critical" updates will not include such things as service packs etc., and that they will be packaged with some new system that reduces them in size by 30-80%..... Now that can't be bad at all........
Don't SYN us.... We'll SYN you.....
"A nation that draws too broad a difference between its scholars and its warriors will have its thinking done by cowards, and its fighting done by fools." - Thucydides
October 15th, 2003, 03:12 PM
I'm not sure if it's a SUS server or not, to be honest with you. I haven't seen the server or worked on it, just heard it discussed, and the theory behind it sounds solid. I'm assuming that you could set the server to auto-download the patches, and then an admin can select the patches that they deem necessary.
Let's be honest, not all of the machines need the .NET Framework patch, and the average user will never use Journal Viewer and things like that!
But on the bandwidth/speed issue: instead of a lot of computers downloading at 2:00 am (though bandwidth at this time really doesn't matter anyway), you have one server downloading the patches and serving them out locally. This can run at 100 Mbps, or if you're lucky enough to have an OC3/OC12 or the like, much faster.
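The bandwidth argument can be sanity-checked with some back-of-envelope arithmetic. Every figure here is an assumption (patch size, link speeds, and the pessimistic one-at-a-time download model), chosen only to illustrate why one local server pulling the patch once beats 1800 clients pulling it over the WAN:

```python
# Back-of-envelope: all clients fetching a patch over the WAN versus one
# SUS-style server fetching it once and clients pulling it over the LAN.
# Figures are illustrative assumptions, not measurements.

clients = 1800
patch_mb = 10        # assumed patch bundle size in megabytes
wan_mbps = 1.5       # e.g. a T1 link
lan_mbps = 100       # switched 100 Mbps LAN

def transfer_seconds(megabytes, megabits_per_second):
    """Seconds to move the given payload over the given link."""
    return megabytes * 8 / megabits_per_second

# Worst case: every client downloads over the shared WAN link, serialised.
wan_total = clients * transfer_seconds(patch_mb, wan_mbps)

# One WAN download, then clients pull from the local server.
lan_total = (transfer_seconds(patch_mb, wan_mbps)
             + clients * transfer_seconds(patch_mb, lan_mbps))

print(f"all clients over WAN: {wan_total / 3600:.1f} hours")
print(f"one pull + LAN:       {lan_total / 3600:.1f} hours")
```

Even with these crude numbers the local-server approach wins by well over an order of magnitude, which is what makes the "1800 machines in a day" claim plausible.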
Anyhoo, it's all just different ways to get to the same result. We are all looking for the same end result: a secure computer for the customers. Personally I prefer having some control over which patches are installed, rather than having them installed automatically as the auto updater does.
And if I remember correctly, one of the features of auto update is to download the patches automatically and then ask before installing them - and I like the ability to say when I want it to download. You just have to be careful to select the download time yourself, rather than letting the computer do it whenever it wants!
This is a great discussion, good topic Tiger!
*edit* Also, I didn't see it in the article, but when will the 30-80% smaller patches be coming out??
October 16th, 2003, 09:31 AM
I think there is a huge difference between "my computer" and corporate desktops ( keeping corporate servers out of this for the moment ). On my personal computer ( both at home and at work ) I like to have a say in what is installed. At the very least I want to be aware of what patch state my computer is in, so then when something goes wrong, I don't waste lots of time just because it is a known issue with a certain patch. So for "my computer", I would agree with thadbme, and have control over them.
However, on our corporate computers, the last thing I want is for the users to be able to reject patches. Maybe we have the most stupid or technophobic users on the planet, but they would still reject a "Critical Windows Security Patch" because it doesn't apply to them, or they are too busy. I think there is a danger in confusing what I want to do ( as a techie ) with what we need to do corporately. In the same way, I mustn't mistake what we do on our 10/5 servers for what needs to be done on 24/7 servers.