One of StorageReview's hallmarks has been our consistent testbeds, which enable direct comparison of a wide variety of drives, not just those found within a given review. Our third-generation testbed has carried us for more than 3.5 years. Temperature assessment has been overhauled.

Seagate, Maxtor, and Western Digital have all entered the fray with SATA units specifically tuned for the enterprise sector. While leveraged from consumer-class SATA designs, these differentiated models undergo testing under different workloads, often enjoy longer factory burn-in cycles, are rated for longer mean times between failures, and are backed by a more business-oriented five-year warranty. Let us take a closer look at three 500-gigabyte units that squarely aim to seize the burgeoning nearline enterprise sector, where cost and capacity rather than sheer IOps drive the market. Join us as we take a look at SR's updated hard drive test suite and see how your favorite disk stacks up!

A Brief History of StorageReview's Testbeds

When we launched StorageReview back in 1998, one of our principal goals was to maintain a consistent, unchanging test platform that would enable readers to directly compare a wide variety of drives with each other. Back then, when one could find them at all, hard drive reviews were always conducted on "the latest and greatest" machine that the individual reviewer could put together. Though such articles occasionally featured one or two other drives tested in the same machine for comparison, by and large it was difficult to directly compare contemporary drives with one another. This changed, however, with StorageReview's debut of Testbed1.

Testbed1: Our initial testbed was a 440LX-based 266 MHz Pentium II machine featuring an ATA-33 controller operating off of the PIIX4 southbridge. An Adaptec 2940U2W provided Ultra2 (80 MB/sec) SCSI functionality. Windows 95 and NT 4.0 laid the foundation for our tests, ZD's WinBench 98 and Adaptec's ThreadMark 2.0.

Despite its flaws, WinBench 98 truly was the best tool to measure single-user disk performance at the time. ThreadMark was our initial attempt to present multi-user performance. Looking back, however, it's clear that the benchmark came up short. Testbed1 nonetheless carried us through dozens of drive reviews spanning two years, from March 1998 to March 2000. By then, it was clear that both the hardware and the benchmarks required updating.

Testbed2 initially hinged upon three key factors. First was the impending introduction of Windows 2000, which, at the time, was heralded as the release that would unify Microsoft's consumer (Win9x) and professional (NT) kernels. Next was Intel's i820 chipset, the first chipset that would introduce Rambus memory, the RAM of the future. Lastly, Testbed2 was to take advantage of ZD's WinBench 2000 to update our single-user performance tests.

All three of these updates failed to materialize. Microsoft decided it needed more time to move its consumer operating system to the NT core and instead updated Windows 95 yet again in the form of Me. The i820 chipset suffered from delays and bugs… and Rambus memory, of course, never took off. Finally, it turned out that WinBench 99 was the last iteration of ZD's venerable component-level benchmark.

Thus, we stuck with a 700 MHz Pentium III paired with Intel's tried-and-true 440BX chipset. Promise's Ultra66 provided ATA-66 operability, while Adaptec's 29160 delivered Ultra160 SCSI compatibility. We chose to go with Windows 2000 Professional and abandoned the Win9x core entirely, since the former, paired with the NTFS file system, represented the future of desktop machines.

Though we presented WinBench 99 results on Testbed2, our big focus was on testing with IOMeter. On the surface, IOMeter's highly customizable nature seemed promising: tinkering with its settings yielded a pattern that we dubbed "workstation," one that we believed could best represent single-user performance. IOMeter, however, lacks any facility to simulate localized drive access, i.e., the tendency for a given piece of required data to be very close to the last piece of data accessed. This delivered "workstation" numbers that differed very little from the server results returned by IOMeter.
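The locality that IOMeter could not model is easy to illustrate. Below is a minimal Python sketch, not anything from IOMeter itself (all function names and parameters are illustrative assumptions), contrasting a purely random block-address stream, which is roughly what a server-style pattern generates, with a localized stream in which most requests land near the previous one. The average jump between consecutive addresses serves as a crude proxy for mechanical seek cost:

```python
import random

def uniform_lbas(n, disk_size, seed=0):
    """Purely random access: each request may land anywhere on the
    disk -- roughly the pattern a server-style workload generates."""
    rng = random.Random(seed)
    return [rng.randrange(disk_size) for _ in range(n)]

def localized_lbas(n, disk_size, locality=0.9, window=2048, seed=0):
    """Localized access: with probability `locality`, the next request
    falls within `window` blocks of the previous one -- the tendency
    a single-user workload exhibits but IOMeter cannot simulate."""
    rng = random.Random(seed)
    lbas = [rng.randrange(disk_size)]
    for _ in range(n - 1):
        prev = lbas[-1]
        if rng.random() < locality:
            lo = max(0, prev - window)
            hi = min(disk_size - 1, prev + window)
            lbas.append(rng.randrange(lo, hi + 1))
        else:
            lbas.append(rng.randrange(disk_size))
    return lbas

def mean_seek_distance(lbas):
    """Average distance in blocks between consecutive requests, a
    crude proxy for how far the actuator must travel per access."""
    jumps = [abs(b - a) for a, b in zip(lbas, lbas[1:])]
    return sum(jumps) / len(jumps)

if __name__ == "__main__":
    DISK = 100_000_000  # hypothetical disk size in blocks
    srv = mean_seek_distance(uniform_lbas(10_000, DISK))
    wks = mean_seek_distance(localized_lbas(10_000, DISK))
    print(f"random-access mean jump:    {srv:,.0f} blocks")
    print(f"localized-access mean jump: {wks:,.0f} blocks")
```

Under these assumed parameters the localized stream's average jump is a small fraction of the random stream's, which is precisely the single-user behavior that a benchmark confined to uniformly random (or purely sequential) access cannot reproduce.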