A Tuple By Any Other Name Would Smell As Sweet
by Bill Wunder
It's always interesting to see
what database vendors have to say about each other. From here,
no vendor looks any better than another in one area for sure:
the marketing distortions, gerrymandering, and outright
malevolence used to convince us their database engine is better
than the other guy's. Generally, when I am reading something
authored by Microsoft or by someone I know to be a Microsoft
proponent, I enable my internal "quality of message" filter when
I come upon comments about Oracle, IBM, or any other database
product. Likewise, if a SQL Server comment comes from an
admitted Oracle advocate, I tend to view it with skepticism.
To begin with, I know only a
thimbleful about administering the Oracle database, and it has
been close to 10 years since my last AS/400 gig. That means you
could tell me anything about 10g or DB2, and my only real
recourse is to trust that you are telling the truth or to not
believe a word you say. Since almost all the comparative
literature I see contains something less than the truth about
SQL Server - whether authored by a pro or a con (pun intended) -
I'm sticking with the notion that the same is true for anyone
who conducts comparative reviews/tests of MySQL or Cache or any
other database product. At least if I know who is buying lunch,
I know the thesis the author must validate.
The more dangerous type of
message is the case where a supposedly neutral third party
does a comparison between products. I mean, when Microsoft
starts talking about TPC-C, I know they've snuck a few nuggets
onto the scales, and when IBM spouts comparative TCO metrics, I
know who will look the best without needing to read the words.
But when an eWeek or Gartner proclamation offers up the same
sort of analysis, it's not so easy to tell who the favorite is
and who the patsy is. It is, however, still pretty easy in far
too many cases to pick out the inaccuracies in the part of the
discussion I am more familiar with. This leaves me convinced
that selecting a database platform these days is an academic
exercise in being somebody's fool. If you are going to make the
decision based solely on research, the decision is whose fool
you want to be, not whose database will do the best job for
your needs.
Having said that, I'm a little
reluctant to even appear to pick on any particular author or
venue, but I would like to provide a little evidence for my
premise. I get a large portion of my database news right from
sswug.org.
Stephen does an awesome job of finding quality database-related
links and sharing them with us in the daily newsletter. I get
my general daily news dose from Yahoo! Among the aggregated
feeds that I watch on Yahoo! are the headlines from CMP's
TechWeb
site.
Today in the TechWeb headlines I
read with some interest an article, "More
Database per Dollar" by
Susana Schwartz. The essence of the story is the accurate
premise that the SMB database market is growing ever more
competitive. To help make the point, Ms. Schwartz summarizes the
findings of a supposedly disinterested third party's comparison
of Oracle 10g and SQL Server 2000 in terms of which is more
cost-effective to administer, then provides some interesting
comments from the folks at the testing organization, Microsoft,
and Oracle. All in all, a short and interesting read that really
did not attempt to support or refute the findings of the study.
Rather, the article set the table for the reader to recognize
the growing complexity and competitiveness in the SMB database
space and included a few comments from Microsoft's Tom Rizzo and
Oracle's Bob Shimp.
There was a link to the
study, "Comparative
Management Cost Study of Oracle Database 10g and Microsoft SQL
Server 2000" - the very same study
that Stephen told us about in the daily newsletter on May 11th ("SQL
Server Admin. vs Oracle Admin - Who's more complex?")
and again throughout the week with
feedback from many
in the sswug.org community. I
can't resist adding my comments to the fray. My favorite part of
the study is the finding that the difference between the
products equates to a 30% salary differential. I guess that
means you can pay the Oracle DBA 30% less, but the study really
didn't say that; it just offered the number. Aside from that
er... ummm... ambiguous interpretation of the test results, I
had more than a few questions about the test's data points.
The first area of consideration
was "Software Installation". While the study found that it took
half as much time to install SQL Server (12.7 minutes for SQL
Server and the Service Pack vs. 20.2 minutes to install 10g),
the test result was a 0% difference in workday savings. They
were able to reach this finding only after factoring in the
installation of a second instance. It looks like adding a second
instance to Oracle is easier than for SQL Server, but my reality
is that I do not use more than one instance of SQL Server 2000
on a production box. Furthermore, I usually budget 1.5 hours for
the installation of a SQL Server. That time includes verifying
that the install did not have problems (see my recent article
SQL Server
Informational Output Files Every DBA Should Know About),
installing SQL LiteSpeed (see my 5-part series
Heavy Duty LiteSpeed
Log Shipping), installing
SMTP email capabilities (see my 4-part series that begins with
SMTP:
Take My SQLMail,
Please!), and setting up
my automated processes for performance monitoring (see my
3-part series on monitoring:
SQL Darwinism - On
SQL Server Baselines, Metrics Collection and Trend Analysis).
While the study's finding sounds ominous, for me it clearly
doesn't match reality.
The next area of
consideration was "Day to Day Database Administration Tasks."
The result was pretty close, with a slight advantage going to
Oracle, but I noted that SQL Server lost some points because
the testers used separate steps to create a file group and a
file. Guess they didn't know you could specify a new file group
name when creating a new database file in SQL Server's
Enterprise Manager, even though the vendor claims that SQL
Server had always been their SMB database of choice in the past.
It also appears that they weren't aware of the index creation
wizard in SQL Server's Enterprise Manager, since they didn't use
it, and as a result Oracle came out barely ahead on that task
given the test "rules". This is doubly suspicious because the
testing criteria were weighted to the advantage of a wizard in
terms of complexity. This advantage for wizards held throughout
except in a later case of recovering a dropped table, where it
was advantageous to use the command line rather than the GUI to
do the recovery in the Oracle product. I suspect Oracle could
still have recovered the table faster, but the testers found an
Oracle-weighted advantage by using the command line for this one
operation. I point this out now because in the "Day to Day..."
category the testers indicate that SQL Server had no automated
way to tell when a database is fragmented. Guess they weren't
familiar with DBCC SHOWCONTIG or SQL Agent? (see my article
Working with DBCC INDEXDEFRAG).
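For anyone wondering what the single-step file group creation and the fragmentation check look like in script form, here's a minimal T-SQL sketch for SQL Server 2000. The database, file path, and table names are made up for illustration:

```sql
-- Create a new filegroup and add a file to it in one script -
-- no separate Enterprise Manager steps required.
-- (SampleDb, ArchiveFG, and the file path are hypothetical.)
ALTER DATABASE SampleDb ADD FILEGROUP ArchiveFG
GO
ALTER DATABASE SampleDb
ADD FILE
    (NAME = ArchiveData1,
     FILENAME = 'C:\MSSQL\Data\SampleDb_Archive1.ndf',
     SIZE = 100MB)
TO FILEGROUP ArchiveFG
GO

-- Report fragmentation for every index on a table; watch the
-- Scan Density and Logical Scan Fragmentation columns.
DBCC SHOWCONTIG ('dbo.Orders') WITH ALL_INDEXES, TABLERESULTS
GO
```

Wrap that DBCC SHOWCONTIG call in a scheduled SQL Agent job and you have exactly the kind of automated fragmentation check the testers said didn't exist.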
All the way through the test I
had questions about the methodologies the testers used for SQL
Server. I'll leave most of the analysis to you - and encourage
you to follow the link to the comments of several other
sswug.org folks - and skip to one last area, "Performance
Diagnostics and Tuning Tasks", where the most significant
differences were noted. Here SQL Server 2000 was a miserable
second in both "Diagnosing Performance Problem" and "Tune
Resource Intensive SQL". First let me say it looks like Oracle
has some nice stuff here, and without knowing anything about the
Oracle offering, my first reaction is that Microsoft needs to do
some "me too" work in this area. (Go ahead, call me a fool.)
Apparently all you have to do is go look at the "ADDM Report" to
identify poorly indexed queries and then click on a problem
query to create the necessary index support. Sounds cool! The
thing is, in my environment I need any new indexes to make it to
development and acceptance and into Source Control, not just be
created in production, so the GUI click may not be the
end-all-be-all that it seems. Also, though the test
documentation doesn't say for sure, it looks like the Oracle
Diagnostic and Tuning Pack doesn't come with the database but is
a separate application (separate purchase?). Is this application
needed to produce the ADDM report? Dunno...
I think the bottom line is that
infomercials are too easily disguised as legitimate testing. No
matter whether the "winner" in a particular showdown is Oracle,
Microsoft, DB2, or even - as happens only rarely so far - MySQL,
the worst outcome of such propaganda is that the boss sees the
articles, accepts the summarizations, presupposes the legitimacy
of the test, and sets his feelers on what a particular study
finds right with its favored product and wrong with the
opponent. (I'm reminded of the day a while back when the boss
wanted to move the intranet to PHP/MySQL in our all-Microsoft
shop because "It's Free!") Perhaps it's all just different
aspects of the journalistic manipulation that we are told we are
seeing in the other areas of advertising and politics. (Gotta
love those conspiracy theories.)
Bill