Forum Home
PC World Chat
 
Thread ID: 116458 2011-03-04 21:16:00 Latest PCworld Rubbish wainuitech (129) PC World Chat
Post ID Timestamp Content User
1183565 2011-03-07 01:40:00 First, let me state the following post is my opinion, and my opinion only, based on my experiences working in and around the IT industry over a number of years. And yes, I'm aware the "general" discussion and testing has been primarily in regard to home user versions of products.

Personally I disagree with the testing methodology in one or two key areas - and they're areas that are very hard to test, admittedly. No, they're not to do with detection rates, but rather stability and reliability of the application in general.

The two primary sites I am currently involved with are of similar size, ~370 and ~400 devices, the former running Symantec Endpoint Protection 11, the latter ESET NOD32 4. Neither really has issues with lack of detection; where we do see a major difference, though, is in management overhead. In the past 6 months we've had 13 instances where a failure directly related to the SEP client or management server caused downtime or required intervention - things such as needing to uninstall/reinstall the client, or manually push out definition files because it just gets grumpy for no reason - for a total of 44 hours of "additional" time spent supporting the product across the 370-odd devices. Contrast this with 1 instance for the ESET site, totalling about 1.5 hours across 400 devices (roughly 7 minutes per device versus under 15 seconds). Bear in mind this is time on top of our standard management overhead for these products, such as reviewing management consoles for alerts, which between the two is a very similar amount of time.

Not only do these issues increase the time and effort required to support the product, but in many cases they open major security holes when they occur. For example, a SEP client mysteriously decided it no longer had valid definitions and refused to update from the SEPM server or from Symantec's own LiveUpdate server; during the troubleshooting period - which eventually required a complete uninstall of the client, clearing out several temp folders and registry locations, and finally a redeploy of the client - there was no real security present on that workstation.

In both cases we are using only the antivirus/antispyware modules, not full firewall / network intrusion suites.

In my own personal experience over the past 12 years or so, Symantec products (and those of some other major brands) present something of an inconsistency: when they are working 100% they actually do a good job, but they have a very high tendency to fall over for a wide range of reasons - and in many cases it's not immediately obvious there is an issue, leading to windows of vulnerability that as often as not catch "Average Joe" users out and they get infected. So they end up with virus/malware infections on top of their non-functional install of Antivirus_Product_001.

Additionally, I have found that the "quality" of detection provided by some of these companies is very inconsistent, and results vary greatly depending on the time of testing. One day they may be very good, with a very high detection rate, but in some cases the response time to major new threats can be very slow, again leading to vulnerability windows. However, some companies' proactive detection methods have proven far more effective at reducing this risk window than others.

Lab testing of AV products does not, in my opinion, give a clear picture of their real-world performance under user conditions, and this inherent limitation is my major disagreement with the methodologies used by many tests - though admittedly it would be very difficult to coordinate a "fair" test under user conditions. Overall, I have seen a much higher relative number of issues (be it detection failures, or other directly product-related failures) with Symantec and McAfee products than with ESET products.

Unfortunately in this case I feel that PC World are in a "damned if you do, damned if you don't" type situation - the Symantec product's biggest failings are in areas that aren't directly tested, and are difficult to test thoroughly in a short time span.

However, that doesn't, in my opinion, mean they are good products, and I couldn't in all fairness recommend that any friend or family member purchase a Symantec AV or security suite, as I personally do not believe they are the best option available.

In terms of the specific testing in question, I would be very interested to see whether there is a detailed description somewhere of the testing methodologies used. I've only had a cursory read of the article, to be honest, so I have quite possibly missed something relevant. Without this, it's hard (and unfair) to be directly critical of how the tests were conducted.
inphinity (7274)
1183566 2011-03-07 01:48:00 Hi Wainuitech,

1. Could you please provide references for your claim?
2. Did you have any queries about Andreas Marx's methodology?

I'm not sure why people are using "Lab tested" as some kind of epithet.

What lab testing can't account for is user behaviour.

- The Zeditor

Just have a look around the forum; there are numerous posts/threads with screenshots proving the claims. See post 42 (pressf1.co.nz/showthread.php?t=106859&page=5) - while this is from last year, at the time Norton was listed as number one again by a PC World article.

Like yourself, as mentioned, I have no time to do more testing myself, and it's very time-consuming to do it fully.


To finish the response:

This is exactly the type of answer I expected - a sidestep reply. ;)

WHY? Because, as you have said yourself, "What lab testing can't account for is user behaviour."

That alone is enough of a statement to confirm it: lab tests are controlled, real-world usage is not, and labs can't simulate that.

AND

"I'm not sure why people are using "Lab tested" as some kind of epithet . "

Oh come on - Did you actually read what you wrote ? :groan:

You're the editor of a magazine that prints articles from various sources and its own in-house results. The average reader will read an article, and if the magazine says product X is the best, they will believe it, or at least expect it to be the best.

It's normally the people who are in this business, who deal with this sort of problem daily and see the "real world results" of what certain software does or doesn't do, who are the ones saying the results are false.

In other words, going from the comments above, the article that's written is a load of misconceptions or opinions and shouldn't be taken 100% seriously, because the average user does things that the labs know will cause different results.

Maybe there should be a warning in the articles, either at the beginning or at the end, stating something along the lines of "this is expert testing and the average user will more than likely get different results".


BUT I will ask again - repeating the question from the previous post with a slight addition:

Why is it that average home users (who mostly read these articles) are still getting infections from so-called "good software"?

I actually know the answer ---- the software doesn't do what it claims 100% of the time where a home user is involved, which is misleading to the readers.

We all know that you guys aren't stupid; you must be aware that techs and repair places see the opposite of what the articles say.

I'll add in another question:
If a person bought Norton software on the recommendation of an article in PC World, then the PC got badly infected, and said person contacted the author -- what would the response be when that person asks why their computer got infected if the software is so good? And will PC World pay to have their computer repaired?


I can guess the response - something like :lol:

Cheers,
Wainuitech
wainuitech (129)
1183567 2011-03-07 02:06:00 If they're talking about "lab results" rather than real-world, then it's not really relevant to a magazine that's arguably aimed at consumers and at giving advice to consumers, rather than advice for labs ;)

We've had a few infections here lately. One machine for some strange reason was still running Norton 2010 (buggered if I know why?) and another was running McAfee. The result was that this "real world" problem got past them both.
www.imagef1.net.nz
www.imagef1.net.nz

Here's what would ideally happen:
Infect a machine with a ton of crap. It's not difficult. You could even find somebody on the forums here to loan you a machine for a week.
1) Ghost the HDD
2) Install AV 1, scan. Install NOD32 and see what it picks up that AV1 didn't. Re-ghost from the source infected image.
3) Install AV 2, scan. Install NOD32 and see what it picks up that AV2 didn't. Re-ghost from the source infected image.
4) Install NOD32, scan. Install every other AV program known to man and see if they can pick up anything that NOD32 didn't.

Good plan? :D
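
For what it's worth, a loop like that is easy to script. Below is a minimal Python sketch of the re-image / scan cycle; every command line in it is a hypothetical placeholder, so you'd substitute your actual imaging tool and each product's real command-line scanner:

import subprocess

# HYPOTHETICAL commands throughout -- substitute your imaging tool (Ghost,
# Clonezilla, etc.) and each vendor's actual command-line scanner.
IMAGE_RESTORE = ["ghost.exe", "-clone", "mode=restore,src=infected.gho,dst=1"]

SCANNERS = {
    "AV1":   ["av1scan.exe", "/all"],
    "AV2":   ["av2scan.exe", "/all"],
    "NOD32": ["nod32scan.exe", "/all"],
}

for name, cmd in SCANNERS.items():
    subprocess.run(IMAGE_RESTORE, check=True)  # back to the known-infected baseline
    result = subprocess.run(cmd)               # detections land in exit code / logs
    print(f"{name}: scanner exited with code {result.returncode}")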
Chilling_Silence (9)
1183568 2011-03-07 02:17:00 [Quoting Chilling_Silence's post above in full]

.... while graphing (real-time) CPU utilisation....
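
For what it's worth, that part is easy to capture too. A minimal Python sketch using the psutil package (the scanner command line is a hypothetical placeholder):

import subprocess
import psutil

# HYPOTHETICAL scanner command -- substitute the real CLI of the AV under test.
scan = subprocess.Popen(["avscanner.exe", "/scan", "C:\\"])

samples = []
while scan.poll() is None:                          # until the scan exits
    samples.append(psutil.cpu_percent(interval=1))  # one system-wide reading per second

if samples:
    print(f"scan ran ~{len(samples)}s, avg CPU {sum(samples)/len(samples):.1f}%, "
          f"peak {max(samples):.1f}%")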
johcar (6283)
1183569 2011-03-07 02:17:00 [Quoting Chilling_Silence's test plan above]

To be honest, that's only half a plan. It doesn't take into account testing of the AV's ability to detect the initial intrusion and prevent the infection, only its ability to detect & remove it post-infection.
inphinity (7274)
1183570 2011-03-07 02:21:00 [Quoting Chilling_Silence's test plan above]

That's basically how I have done some testing, with one exception.

The AV being tested is on a clean machine, restored from an image, so there is no possibility of anything already being on the test machine. Once tested, the clean machine is re-imaged and another AV loaded - once again so there are no cross-detections from quarantined infections.

The infected drive is slaved to the test machine after the AV has been loaded and updated, then scanned with the AV, as most of the time the infected drive won't run an AV due to its infections. If it's a customer's PC, it's not too good saying "I need it for a few weeks to do some testing" :)


[Quoting inphinity] "To be honest, that's only half a plan. It doesn't take into account testing of the AV's ability to detect the initial intrusion and prevent the infection, only its ability to detect & remove it post-infection."

That's true, and it would take a bit of time to do multiple tests with various AVs.

Then again - if the AV was any good in the first place, the machine wouldn't get infected :D And as we all know, infections are getting past and infecting machines even though "good / the best" programs are installed.
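
As an aside, the cross-check step both plans rely on - seeing what one scanner flagged that another missed - is easy to automate once each product's detections are exported to a text log. A minimal sketch, assuming one detected item per line (the log file names and format are assumptions; real scanner logs vary, so the parsing would need adapting):

def detections(path):
    # Read one scanner's log into a set of detected items.
    with open(path, encoding="utf-8", errors="replace") as f:
        return {line.strip().lower() for line in f if line.strip()}

av1 = detections("av1_scan.log")      # hypothetical log file names
nod32 = detections("nod32_scan.log")

missed = nod32 - av1                  # flagged by NOD32 but not by AV1
print(f"AV1 missed {len(missed)} item(s):")
for item in sorted(missed):
    print(" ", item)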
wainuitech (129)
1183571 2011-03-07 02:30:00 It'd take a bit of time, but it'd give you something more solid to base things on.

Another solution:
Find websites that infect you and then visit them (from a clean, snapshotted virtual machine?)
Do this with a dozen-odd known viruses / trojans, etc.
Sound like a plan?
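
A rough sketch of that snapshot loop, driving VirtualBox's VBoxManage CLI from Python (the VM name, snapshot name, and URL file are assumptions, and actually visiting each URL inside the guest is left as the manual step):

import subprocess
import time

VM, SNAP = "av-testbed", "clean-with-av"   # hypothetical VM and snapshot names

with open("known_bad_urls.txt") as f:      # hypothetical list of malicious URLs
    urls = [line.strip() for line in f if line.strip()]

for url in urls:
    subprocess.run(["VBoxManage", "snapshot", VM, "restore", SNAP], check=True)
    subprocess.run(["VBoxManage", "startvm", VM], check=True)
    print(f"visit {url} in the guest and note whether the AV blocks it")
    time.sleep(300)                        # give any drive-by time to fire
    subprocess.run(["VBoxManage", "controlvm", VM, "poweroff"], check=True)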
Chilling_Silence (9)
1183572 2011-03-07 02:38:00 Just warning you guys that Zara is currently on her way to an event at Sony Australia for a couple of days and will likely not reply until she gets back. Siobhan Keogh (16063)
1183573 2011-03-07 02:42:00 Bugger :(

Perhaps that's a good chance for me to build a VM or two of my own and do some testing!!

Still, thanks for the heads up :)
Chilling_Silence (9)
1183574 2011-03-07 02:45:00 [Quoting Siobhan Keogh's note above]

Nice work if you can get it!!

Playing with the latest Sony toys!!!
johcar (6283)