Re: Microsoft destroys TPC-C records!
It proves which company has the better programmers. Perhaps benchmarks
aren't applicable to the average business owner or IT manager - but they
_sound_ impressive. And until SQL Server 7, what real competition did
Oracle have? Sybase? *guffaw* So, with this in mind, the benchmarks
keep the blood of creativity flowing - competition grows stiff, and
that _IS_ good for the consumer, because it gives Oracle a reason to make
its product better and faster.
sonya
In article <38B3E9A6.90B0105_at_us.ibm.com>,
Larry Edelstein <lsedels_at_us.ibm.com> wrote:
> This is a perfect example of why you just can't use benchmarks as a
> credible criterion (my opinion) for making a decision on a database.
> Benchmarks are intended to demonstrate the raw processing power and
> efficiency of a solution. When you start to do things like use
> materialized views, you are no longer doing that. True ... you are using
> a great feature of a db that has practical application. But to point to
> benchmarks like this and say that they are a reason to support a
> particular solution is ridiculous. In this case, they don't even afford
> a valid basis for comparison between database vendors (although, on the
> other side, I suppose it also points to relational db vendors who don't
> have features that others do). That is why the TPC has modified its
> strategy and has initiated the new TPC-H and TPC-R benchmarks. To point
> to the current "leaders" in the benchmark race and say that they are the
> winners is shortsighted ... this is nothing more than a contest whose
> results will change over and over again during the course of the year.
> And in addition, the workloads may not be truly representative of your
> workload, so what does it prove?
>
> Norris wrote:
>
> > TPC-D history
> >
> > "Oracle Million Dollar Challenge," which Oracle CEO Larry Ellison
issued to Microsoft at Fall Comdex. Ellison said Oracle would pay $1
million to any person who could demonstrate that SQL Server 7.0 is not
at least 100 times slower than the fastest Oracle database when running
a query against a standard decision-support benchmark. That standard
benchmark, the TPC-D suite managed by the Transaction Processing
Performance Council (TPC), has long stood as the most commonly accepted
measure of decision-support
> > performance. (Another benchmark developed by the OLAP Council is
more specifically designed to measure OLAP engine performance.)
> >
> > Following the challenge, the SQL Server 7.0 team tried to poke holes
> > in it, noting, for instance, that the system on which Oracle had set
> > the record was a $9.66-million behemoth consisting of a 64-processor
> > Sun UltraEnterprise 10000 "Starfire" Server. Microsoft's main
> > counterattack, however, was directed against the TPC-D benchmark
> > itself. Although the benchmark was developed to measure different
> > systems' abilities to process complex, ad hoc queries, Microsoft
> > charged that Oracle and other vendors developed a way to crack the
> > test. The database vendors knew the nature of the questions the TPC-D
> > benchmark would pose, Microsoft said, and they used "materialized-view"
> > techniques to pre-compute summary tables containing the data the tests
> > would request. This pre-computation, Microsoft contends, significantly
> > increased the loading time of the databases, but the TPC-D benchmarks
> > didn't measure the loading time, only the execution time.
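
(For illustration only: a summary of the kind Microsoft was describing can
be pre-computed with an Oracle materialized view along these lines. The
table and column names below are simplified stand-ins, not the actual TPC-D
schema or any published benchmark implementation.)

    -- Pre-compute revenue per region and year at load time, so a matching
    -- "ad hoc" query can be answered from the summary instead of the detail rows.
    CREATE MATERIALIZED VIEW sales_summary
      BUILD IMMEDIATE              -- populated during the load, which is what inflates load time
      REFRESH COMPLETE ON DEMAND
      ENABLE QUERY REWRITE         -- lets the optimizer rewrite matching queries against the summary
    AS
    SELECT o_region,
           TO_CHAR(o_orderdate, 'YYYY') AS order_year,
           SUM(o_totalprice)            AS total_revenue,
           COUNT(*)                     AS order_count
    FROM   orders
    GROUP  BY o_region, TO_CHAR(o_orderdate, 'YYYY');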
> >
> > Sour grapes? Many thought so, since SQL Server doesn't support
> > materialized views (though Microsoft will add that feature to the next
> > version of its database). However, it turns out that even the TPC
> > itself was having second thoughts about the value of its TPC-D
> > benchmarks. Performance times had dropped precipitously during 1998 for
> > the very reason Microsoft cited--vendors were pre-computing the
> > answers. As the TPC Administrator notes on the organization's Web site
> > (www.tpc.org), an effort now is underway to break the TPC-D into two
> > separate benchmark tests, one that assumes pre-computation has occurred
> > and one that gets back to the original goal of measuring response time
> > to truly ad hoc queries.
> >
> > Still, the SQL Server team wanted some way to prove its product's
> > performance, even if it couldn't use materialized views to answer the
> > challenge Oracle posed. In mid-March, the team announced it had decided
> > to use OLAP techniques rather than materialized views to run the
> > specific TPC-D query cited in the Oracle challenge. As opposed to
> > materialized views, which pre-compute summary tables based on advance
> > knowledge of the nature of the likely queries, OLAP is more of a
> > post-load, on-the-fly technique for computing such values. Working with
> > partner Hewlett-Packard, Microsoft constructed a 1TB OLAP cube based on
> > the TPC-D query. According to Microsoft, the resulting system matched
> > or exceeded the performance of the Oracle system, but cost only about
> > one-twentieth as much.
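
(Again purely for illustration: the "post-load, on-the-fly" flavour of the
same aggregation, computed at query time rather than stored ahead of the
run. This is plain SQL with CUBE, not the actual HP/Microsoft OLAP Services
cube, and it reuses the made-up columns from the sketch above.)

    -- Roll the totals up across every combination of region and year when
    -- the query runs; nothing is pre-computed during the data load.
    SELECT   o_region,
             TO_CHAR(o_orderdate, 'YYYY') AS order_year,
             SUM(o_totalprice)            AS total_revenue
    FROM     orders
    GROUP BY CUBE (o_region, TO_CHAR(o_orderdate, 'YYYY'));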
> >
> > http://webevents.broadcast.com/microsoft/gettingresults/summit.html
> >
> > In comp.databases.sybase Nicholas Dronen <ndronen_at_io.frii.com> wrote:
> > > In comp.unix.aix Frank Hubeny <fhubeny_at_ntsource.com> wrote:
> > >> I heard about a half year ago that Oracle was offering a reward of a
> > >> million dollars to anyone who could prove that SQLServer did not run
> > >> 100 times slower than Oracle.
> >
> > >> At the time I heard this, I suspected that SQLServer might be at most
> > >> 10 times slower, but the only way for Microsoft to win such a
> > >> challenge would be to actually score faster than Oracle.
> >
> > > The crux of the challenge was a single, fairly complex SQL query, not
> > > a vague notion like "this complex piece of software is 100 times
> > > slower than this other equally complex piece of software." That is,
> > > the test was of the capability of the database to handle a seemingly
> > > difficult operation quickly. It was a test of the prudence of the data
> > > structures and algorithms of the database software. That someone can
> > > put together a cluster with three times the number of processors
> > > (which, interestingly, doesn't even *double* the performance of the
> > > IBM S80) to make things seem zippy doesn't change the unmet status of
> > > Oracle's original challenge.
> >
> > > Regards,
> >
> > > Nicholas Dronen
> > > ndronen_at_frii.com
> >
> > --
> > JULY
>
>
Sent via Deja.com http://www.deja.com/
Before you buy.
Received on Wed Feb 23 2000 - 12:32:57 CST