Winstep Forums
 Post subject: Re: New Win11 system: Alder Lake 12900K, 64GB DDR5, Z690 Ext
PostPosted: Mon May 09, 2022 2:11 am 
Offline
Site Admin
Site Admin
User avatar

Joined: Thu Feb 26, 2004 8:30 pm
Posts: 11930
Things start going wrong

I started the build some time after lunch while there was still plenty of daylight, and I planned on being finished before nightfall, as the artificial light available in the living room is not that great - oh how wrong I was.

Working on two systems at the same time, one of them I was not familiar with at all, ended up taking many hours - but I pushed myself to keep going as I needed at least one of those systems up and running by the end of that day.

The first big problem occurred when I tried to replace the 2080 TI, which was temporarily installed in the Helios, with the much larger 3090. By that time it was already dark outside and I was very tired, having already been working on the two systems for hours.

My original plan was to install the 3090 vertically, and with the Helios you have two options: either closer to the motherboard or closer to the glass.

A typical problem with most cases when installing the GPU vertically is that the fans end up too close to the glass, which greatly restricts airflow and increases running temperatures. Since the Helios gives you two options, I obviously chose to install the 3090 next to the motherboard, except that when I tried it...

... it would not fit.

I could not push it far enough towards the motherboard in order to align it with the mounting brackets: it was hitting something that prevented it from going further.

Image

At first sight it looked like the problem was twofold: the 3090 is not only longer but also much taller than the 2080 TI, and the Helios was designed at a time when the largest GPUs around were still the 2080 TIs.

The Asus Z690 Extreme motherboard has that small OLED screen in the middle (you can see it just below the AIO) which also doubles up as the cover/heatsink for the PCIE 5.0 NVMe slot under it. Because the 3090 Strix is taller than a 2080 TI, the top of the 3090's backplate ends up banging against the very bottom part of the OLED screen (something that would not happen with a 2080 TI on this motherboard).

But this was not the only issue: just behind the card, there are two USB 3.0 cables that connect the USB connectors at the front of the case to the USB 3.x ports on the motherboard. These cables are pretty thick and rigid.

Unlike on most motherboards, these ports on the Z690 Extreme are horizontal (as SATA ports usually are on all motherboards). One of the Helios USB 3.0 cables ended in a straight connector, but the other USB cable formed an L.

Image

Because of the unusual horizontal placement of the ports in the Z690 Extreme, the L connector ended up having the longest part of the L going up (towards the glass) instead of down AND it had also been mounted on the very first (topmost) USB connector on the motherboard, which happens to be positioned just under the GPU card.

This was fine with the much shorter 2080 TI which had enough clearance when installed horizontally, but it would hit the bottom of the larger 3090 EVEN when the card was installed horizontally! Because these cables are VERY stiff, I could not force the card down towards the cable either, at least not without breaking something.

This meant that while that L-shaped connector was attached to that first USB port, I could not install the 3090 even horizontally. I played with installing the 3090 vertically in the position closer to the glass, but not only would that mean higher temperatures, I also didn't like how flimsy it looked without support at the back of the card to prevent GPU sag.

I now had to swap the two cables anyway so that the one with the straight connector went just beneath the card, giving it lots of clearance. Keep in mind it was now late at night, the light was poor, I was really tired and frustrated, and my body ached from the weird positions I had been contorting into all afternoon in order to put the two systems back together.

Perfect recipe for disaster.

Image

Ideally I would have removed the Helios cable cover/GPU anti-sag mechanism to have better access to the motherboard ports, but I was so tired I couldn't even locate the screws that would allow me to do this behind all those cables at the back of the case, so I pressed on regardless.

I unplugged both USB 3.0 cables and connected the L-shaped cable to the bottom port with ease, but when I went to plug the straight cable into the top port I couldn't do it - it wouldn't fit even after wiggling it around a bit!

Not thinking correctly, I then did the worst thing I could possibly do: I unplugged the L-shaped cable and tried to plug the straight cable back into its original port. And again I couldn't!

By the time I realized what I had done and why, it was already too late: compare the size of the holes in the L-shaped connector to the size of the holes in the straight connector shown in the previous photo. See how much larger they are? That's why it was so much easier to insert the L cable.

Image

The smaller holes in the straight cable meant that inserting it was much harder, as it had to go straight in, and given the slight angle I was working at (due to not having removed the cable cover), what ended up happening was me bending the pins on BOTH USB 3.x ports of the motherboard:

Image

With a sinking feeling in my stomach, I left the two cables unplugged (effectively disabling all the USB ports at the front of the case), installed the 3090 horizontally and called it a day, before I managed to make a bad situation even worse (I later read that those USB pins are very, very, easy to break and if that happens then it truly is game over for those ports).

Moved the Helios to the office, plugged it in for the first time after all the changes I had made to it, and prayed it would boot.

Image

It did. Checked to make sure everything was working ok and that no drives were missing, and crashed into bed.

Ah, and in case you are wondering what that white thing I installed on the cable cover is (where normally the Helios fan + RGB hub would go), that is actually an outdoor thermometer. I use it to measure the temperature inside the case vs. ambient temperature (it communicates wirelessly with the main indoor thermometer on my desk). Pretty useful to know how efficient your case cooling actually is when your system is under load.

The following day I ordered from Amazon a GPU holder (so I could install the 3090 vertically close to the glass in a less flimsy way and see for myself how much that would affect temperatures) plus some USB 3.0 extension cables (so I could try and fix the issue with the motherboard's USB ports) and the plan was to revisit the whole setup soon after.

That's when covid threw a wrench into my carefully laid out plans. In the famous words of John Lennon, "Life... is what happens to you while you're busy making other plans". :)

_________________
Jorge Coelho
Winstep Xtreme - Xtreme Power!
http://www.winstep.net - Winstep Software Technologies


 
 Post subject: Re: New Win11 system: Alder Lake 12900K, 64GB DDR5, Z690 Ext
PostPosted: Wed May 11, 2022 6:26 pm 
Fixing things

Covid itself was pretty uneventful, with a very low fever and a sore throat (thanks to the Omicron variant being a lot milder than the Delta variant), but two weeks after the onset of symptoms I ended up developing a pulmonary embolism as a complication of covid combined with prolonged physical inactivity. This basically put me out of commission for the entire month of February.

Anyway, once I was back on my feet and feeling good again, I went back to work on the new system (by now I had already replaced the 32GB of ADATA XPG DDR5 RAM with the 64GB of Corsair Dominator Platinum DIMMS).

I wanted to move the T-FORCE SATA SSD from the back of the case (where it had little to no airflow) to the front of the case (using double sided tape if necessary) and I still hadn't given up on the idea of putting the 3090 in a vertical orientation.

This time I removed the Helios cable cover/GPU holder, and figured out I could actually screw either the left or the right side of the SATA SSD to the holes originally intended for the Asus fan/RGB Hub, so that all that nice RGB would not go to waste. :D

I also removed the white outdoor thermometer I had double sided taped to the cover and moved it to the PSU cover instead.

Image

With a needle and lots of patience I slowly straightened the bent pins on the two motherboard USB 3.x connectors. With the cable cover removed I was now able to plug the straight cable directly to the motherboard without bending any pins, but the L shaped cable still interfered with the GPU despite being connected to the second (lower) port. This is where one of the USB 3.0 extender cables I had ordered from Amazon still came in handy.

Image

Before trying the outermost position for the 3090, I decided to try placing it closer to the motherboard again. Thanks to the extender, the L-shaped cable was no longer interfering, and I could always try removing the small OLED screen that doubles up as an NVMe drive cover.

Much to my surprise, that ended up not being necessary. The top edge of the 3090 still rubbed against it, but the real problem had been the L-shaped cable all along - I was now able to have the best of both worlds: a vertical GPU, but without the higher temperatures that come with it being too close to the window. The GPU holder I got from Amazon in case I decided to install the GPU closer to the glass never even got out of the box.

But then I (of course) ran into another problem: the (very expensive) 25 cm version of the PCIe 4.0 LINKUP GPU riser cable, although PERFECT for the position closest to the glass, was now too long to fit snugly behind the 3090 without some seriously tight bends (a no-no for riser cables). I ended up having to order the 15 cm version instead (but kept the 25 cm version just in case).

The old PCIe 3.0 riser cables will not function properly with the newer PCIe 4.0 graphics cards, and because PCIe 4.0 riser cables must be of much higher quality to preserve signal integrity at PCIe 4.0 speeds, they are also a lot more expensive than their 3.0 brethren.

The CableMod fiasco

On the C700M I was using black & white Cablemod cables to match the overall black & white theme.

Image

These cables were not extensions, but full Seasonic compatible power cables I had ordered from Cablemod back when I still had the 2080 TI and GPUs only required two PCI-E 8 pin cables.

With the 3090 I ended up buying a matching pack of Cablemod extension cables rather than having to wait God knows how long for a new PCI-E 8 pin cable directly from Cablemod (they take a long time to ship custom cables AND the shipping prices would have made that single cable even more expensive than the whole extension pack).

This turned out to be somewhat of a mistake, as power extensions actually reduce the quality of power being delivered (perhaps due to the added resistance of the longer distance the power needs to travel).

You see, according to Asus, to fulfill the power demands of the GeForce RTX 3090, three 8-pin power connectors are present, along with an onboard circuit that monitors PSU rail voltage. The circuit is fast enough to catch any transients that result in the rail voltage dropping too low. If that happens, a red LED will light up to indicate a power supply issue.

Sometimes while playing a demanding game the LED of the connector with the extension cable would start blinking red (but not the others). This never resulted in any (visible) issues, but it proved how using an extension cable can actually degrade the quality of power being delivered to the card.
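Purely as a toy illustration of what that monitoring circuit does (the 11.4 V threshold and the sample values below are made-up numbers for the sketch, not Asus's actual specification), catching such a transient amounts to comparing each sampled rail voltage against a minimum:

```python
# Toy model of an undervoltage transient monitor on a 12 V PCIe power rail.
# The 11.4 V threshold (12 V minus 5%) and the samples are illustrative
# assumptions, not measurements or Asus's real circuit behavior.

def find_transients(samples, threshold=11.4):
    """Return the indices of samples where the rail voltage dropped
    below the threshold (i.e. the moments the red LED would light up)."""
    return [i for i, v in enumerate(samples) if v < threshold]

# A demanding game causes a brief dip on the rail fed by the extension cable:
rail_samples = [12.05, 11.98, 11.91, 11.32, 11.87, 12.01]
print(find_transients(rail_samples))  # -> [3]
```

The point of the real onboard circuit is that it samples fast enough for such brief dips not to slip between samples.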

Anyway, when I moved the 3090 to the Helios and the 2080 TI back to the C700M, I didn't want to remove those nice Cablemod cables from the C700, so I decided to order some more.

Alas, the Cablemod site said they were experiencing weeks long delays due to restructuring etc, so I ended up ordering a Seasonic-compatible Cablemod kit from Amazon (faster and cheaper).

I failed to read the reviews and this turned out to be another mistake: despite how long the 30 series GPUs have been out, these kits are still designed for the 20 series - the 3rd 8-pin PCIE cable is missing (although you wouldn't notice that just by looking at the pictures for that kit).

So, I was back to using an extension cable for one of the 3090 power connectors. Worse, with the three separate cables plus having to fit them all through the top aperture on the PSU cover, I couldn't get them to align properly and the result looked really, really, bad:

Image

I still tried routing the cables behind the card but, since the ModMesh cables are very thick, that turned out to be a no-go (not to mention they would be mostly hidden behind the card, so what was the point?).

I ended up completely giving up on the idea of using the Cablemod cables and went back to the black cables that came with the ROG Thor.

Setting up the all important RGB

Beauty is, as always, in the eye of the beholder, but being the creator of Windows customization software I am - not surprisingly - a fan of a few lights here and there (although NOT so much rainbow puke).

Besides, we all know RGB makes your system 50% faster. :D

The Helios case, the Strix 3090 and the Asus Z690 Extreme motherboard are, of course, Aura compatible. The problem is that Asus seems determined to piss everyone off by bundling great hardware with utterly terrible software.

The older Win32 stand-alone version of Aura Sync has been deprecated in favor of the monolithic Armoury Crate and can no longer be used to control the latest 30 series of Asus graphics cards. Even when it was still available - despite Asus having had YEARS to develop the damn thing - it still could not do BASIC tasks such as applying different RGB effects to different hardware. The best it could do was display different SOLID colors for each motherboard zone, RGB header, etc...

With Armoury Crate Asus is trying to unify EVERYTHING into a single piece of software (adding everything to it plus the kitchen sink). Personally I think this is a HUGE mistake, especially given that Armoury Crate is a UWP app. It's slow, EXTREMELY bloated, inefficient, consumes a TON of memory - I've seen one of the 6 (!!!) different asus_framework.exe instances that are currently running on my system use up to 1.5GB of RAM! - and literally spreads out into your system like cancer.

To give you an idea, Armoury Crate is not even open, yet I have AT LEAST 25 (!!!) Asus/Armoury Crate related processes and services currently running on my system (including 5 instances of ArmouryWebBrowserEdge, 6 instances of ASUS NodeJS Web Framework, plus 32-bit and 64-bit versions of AAcKingstonDramHAL when I don't even have Kingston RAM installed on my system).

There is no standalone streamlined Win32 application to control just my Asus Ryujin II AIO either. If I want to control it I have no choice but to install Armoury Crate. Likewise if I just want to change some colors.

This is so bad that to uninstall Armoury Crate you MUST run their own standalone uninstaller utility, which, to make matters worse, seems to be version specific. Fail to do this, try to uninstall Armoury Crate as you would any normal application via the Windows Apps & Features page in Windows Settings, and you might as well get ready to reformat your system: it will NEVER work right again after that.

And to make things even more laughable, Aura Sync in Armoury Crate has even LESS functionality than the old Win32 Aura Sync: now EVERYTHING must display the same color and effect (except some devices like the Strix 3090 that allow you to optionally control them individually, but those are the exception and not the rule).

Image

You can no longer distinguish between different motherboard zones, individually control different RGB and ARGB headers on the motherboard, etc... In other words, not only does Asus now force you to install a crap ton of bloatware onto your system, but that software has EVEN less functionality than the - already lacking in BASIC functionality - old Win32 Aura Sync.

Plus it keeps FORCING you to update it, and the update often ends up breaking existing functionality that was previously working fine - and it does not let you go back to the previous version. A total mess.

I don't remember ever seeing a piece of software so universally disliked. But does Asus listen to their customers? Nah. Of course not.

As I mentioned a thousand times previously, in my experience the only RGB controlling software that is actually worth installing on your system is Corsair's iCue (and no, they are not sponsoring this :D ).

With iCue, each piece of hardware can be controlled individually and you can mix and match colors and effects to your heart's content, even layering them on top of each other. Not only that but, unlike what happens with Armoury Crate whenever you have anything more complicated than a solid color, CPU usage is pretty much negligible, even with extremely complex RGB effects.

Image

There is one problem though: it only works with proprietary Corsair hardware (and yes, I am aware of the Asus plug-in, but that requires Armoury Crate to be installed, does not support ARGB headers and the plugin often breaks when Armoury Crate updates itself).

The fact that iCue is very near perfect is one of the reasons why I wanted Corsair DDR5 RAM, so I could control it with iCue.

What you may or may not know is that there is actually a way to make Aura Sync compatible hardware (such as LED strips, fans, even the ROG Thor PSUs and the Helios case itself, etc...) work with iCue.

In order to do that you first need to buy an Aura (3 pin ARGB) to iCue cable adapter like this one HERE.

Instead of connecting the Aura compatible hardware directly to the motherboard ARGB header, you connect it to the adapter cable. The other end of the cable then connects to a Corsair Lighting Node Pro, like this:

Image

In the Helios, Armoury Crate would control the motherboard, the AIO and the Strix 3090 (the latter two can actually be controlled individually, but not the motherboard ARGB headers and built-in LED strip, which I was going to set to neutral white).

My idea was to use iCue to control the Corsair Dominator Platinum RAM, the Corsair LL140 fan at the back of the case and a couple of Corsair LED strips (plus my Corsair K95 keyboard and Polaris mousepad, of course). Furthermore, with the cable adapters I wanted to see if it was possible to also use iCue to control the RGB in the ROG Thor PSU, the T-Force SATA SSD, and even the Helios case itself!

I ordered 4 of these adapter cables (two packages, each coming with two cables). Note, however, that the Lighting Node Pro is obviously NOT included, only the cables themselves.

Unfortunately, each Lighting Node Pro only has two ports. I was already using one Lighting Node Pro to control the LL140 fan at the back of the case and the Corsair LED strips. To control the rest I was going to need more Lighting Node Pros - and since iCue controls each of these via a USB 2.0 header on the motherboard, and the Z690 only has two, both already in use, I was also going to need a USB 2.0 hub.

I ended up ordering an NZXT Internal USB 2.0 Hub from Amazon:

Image

Please note that there are a TON of cheaper USB 2.0 splitters available on Amazon, but most of these apparently do not share all the signal lines. As a result, the hardware will light up but iCue will not recognize that a Lighting Node Pro device is connected to it. The NZXT hub does the job properly, plus it comes with small rubber caps to cover unused ports, so that the metal pins are not exposed.

With all these hubs, fan controllers, Lighting Node Pros, etc., cable management at the back of the case ended up suffering greatly :D (and you can't even see the two other Lighting Node Pros plus the Corsair fan hub for the LL140 that are tucked away deep inside the PSU compartment eheh):

Image

So many hubs and controllers, in fact, each requiring its own independent power, that I ran out of SATA power connectors and had to order a couple of SATA Y splitter cables. :D

Aura Sync vs. Corsair iCue

So, since I wanted the LED strip alongside the Z690 motherboard to be white, and in Armoury Crate I cannot individually control the ROG Thor PSU, this would have resulted in an all-white ROG logo (as you can see in the photo below, taken when I was still using the black & white Cablemod cables).

Image

If you leave the Thor's RGB cable unconnected, the Thor defaults to solid red (look around for videos displaying systems with the ROG Thor and this is what you will see mostly). If you go to the ROG Thor product page on the Asus website, however, you will notice that the diagonal stripe actually features a nice blue to red transition:
Image
I wanted to reproduce this, but without the ability to control individual RGB headers on the motherboard (much less address individual LEDs), you simply cannot do it using Armoury Crate.

Likewise for the pattern on the glass at the front of the Helios case: you either use one of the case's built-in RGB effects, or you get whatever effect/color is provided by Armoury Crate (which in this case would have been solid white). Most people default to using the color wave effect (see below), but I wanted something static based on white but not all solid white.

Image

Finally, if you look at the side photo of the Helios above where the ROG Thor PSU is white, you should notice that the cool letters and symbols on the side of the PSU cover are mostly not visible in the dark even with a LED strip just below it. This is because of the way the PSU cover attaches to the case and the spacing in between.

To try and work around this I ordered a custom made ARGB Helios PSU cover side panel from AliExpress. The Chinese have a TON of different panels for the Helios with lots of different patterns: I chose a pattern that only displayed the letters and symbols, like in the actual PSU cover.

Image

I was very surprised that it only took a week and a half from ordering to the panel arriving at my doorstep.

Most examples of these panels for the Helios on the net show them using rainbow puke, but all I wanted was the possibility of lighting up the letters and symbols in white, so they could actually be seen.

With three Lighting Node Pros (remember, each Lighting Node Pro only has two ports) and the four Aura to iCue cable adapters, I could thus use iCue to control the LL140 fan at the back plus a series of Corsair LED strips going around the case with the first one, the ROG Thor and the Helios front panel with the second, and the side panel and the T-Force SATA SSD with the third.

With iCue allowing me to address LEDs individually, I managed to get the effect I wanted for the ROG Thor (the photo doesn't do it justice though, and the glass reflections don't help either; the red looks pinkish but I can assure you it's red in real life, and the diagonal stripe begins with a bluish accent):

Image

As for the front panel, again addressing LEDs on an individual basis, I managed to get the bottom pattern white and the actual ROG logo red:

Image

Below you can better see the end result, with the side panel letters lit white. The panel is REALLY bright, so even though I set it to solid white I could not have used Armoury Crate - it would have been too bright and diverted all attention from the rest of the case! With iCue I managed to make it much darker and thus a bit more sober:

Image

Image

Unfortunately, even with the Aura to iCue adapter cable I was not able to control the RGB in the T-FORCE SATA SSD. This is because it uses some kind of micro-USB to Aura connector cable instead of a simple Aura 3-pin ARGB cable, so it refuses to let itself be controlled by iCue.

In the end I prefer that weird mix of red and green it defaults to (guess the blue channel went belly up after being solid white for so long in the C700M) to a "white" that by now is actually a dark shade of puke yellow. As it is, it kind of matches the colors of the AIO and the motherboard's Anime Matrix LED displays.

 
 Post subject: Re: New Win11 system: Alder Lake 12900K, 64GB DDR5, Z690 Ext
PostPosted: Wed Dec 20, 2023 11:41 pm 
Replacing the Asus Strix 3090 OC with a Strix 4090 OC

When nVidia released the 4090 back in October 2022, I was lucky enough - thanks to my usual hardware supplier - to get my hands on an Asus ROG Strix 4090 OC barely a month later. Not even going to comment on what we are paying for flagship GPUs these days. :shock:

Image

In anticipation I had already ordered from Cablemod a custom black & white 12VHPWR PCI-e Cable for ASUS and Seasonic PSUs (the 1200W ROG Thor is a Seasonic OEM) which got here before the GPU, but for various reasons it actually ended up being a while before I installed the 4090.

Initially I wanted to mount the 4090 vertically, like the 3090 before it, but I had some doubts on whether it would fit or not, despite the huge size of the Helios case (and the native vertical mounting mechanism). In fact, it did not, but not so much because of the size of the GPU.

Image

Turns out the major problem was the position of the 12VHPWR connector at the top of the card: it ended up right under the Corsair DDR5 RAM sticks. There was absolutely no way the CableMod 12VHPWR connector would fit between the GPU and the RAM, not without seriously bending it - and with all the reports at the time of burning 4090 connectors, that was not something I would even consider doing.

Using the alternative vertical mounting position of the Helios with such a thick card was no option either, as the card would then sit flush against the Helios side panel, preventing it from receiving proper airflow.

And so I ended up mounting it horizontally:

Image

Image

The new 4090, without any sort of overclock, was roughly 55%+ faster than my 3090. Not bad considering I game at 4K on a 120Hz OLED monitor (especially now with the new Unreal 5 games + ray tracing making a mockery of GPUs that are still considered top-of-the-line).

Image

Given all the reports of melting connectors and how expensive this card is, I decided to keep my 4090 nearly "stock", i.e., consuming 450W at the most (the Strix can go up to 600W if you let it, but I bet that is one of the big reasons why connectors began burning up left and right). So I have a very modest 135 MHz core overclock - the core clock boosts to 2910 MHz when running a game - and +500 MHz for the GDDR6X memory.

 
 Post subject: Re: New Win11 system: Alder Lake 12900K, 64GB DDR5, Z690 Ext
PostPosted: Fri Dec 22, 2023 12:02 am 
Storage changes:

So, in this system (Helios) I was using a 2TB Seagate Firecuda 530 as the system drive, but despite it being Gen4 I was not extremely happy with it.

For instance, for some reason copying a 20 GB Outlook.pst file from the Firecuda to another SSD would reveal drastic fluctuations in transfer rate, and apparently not because the Firecuda was overheating (unless the temp sensor is bad or Seagate is lying about the temperatures). At 35C idle, the Firecuda is usually the coolest of all the NVMe drives in my system.

Nothing to do with the other drives either as the same thing would happen regardless of which drive I copied to.

Besides this, I had the 960GB Intel 905P Optane PCIe drive sitting idle in my previous C700M system, and that is just a HUGE waste of an Optane drive. Despite being only Gen3, the Intel 905P is still unbeatable in terms of IOPS and 4K reads, not to mention write endurance.

Image

For a quick comparison of the drives in my system, the 905P and the two 970 EVO Plus are Gen3, the Firecuda and the 980 Pro are Gen4 and the T-Force is a SATA SSD. Despite being Gen3, as you can see nothing yet gets close to the Optane in terms of 4k Q1T1 reads (which is what makes the system feel "snappy").

Sequential transfer rates look nice for marketing purposes, but most of the time the system will be doing random 4K transfers, not copying huge sequential files. Not only that, but the huge write transfer speeds of ALL of the drives above will fall off a cliff once the SLC/DRAM cache becomes exhausted - except the Optane's. The Optane will keep writing at 2500 MB/s until all hell freezes over.
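A quick back-of-the-envelope model makes that write cliff obvious (the cache size and speeds below are made-up illustrative numbers, not measurements of these specific drives):

```python
def sustained_write_time(total_gb, cache_gb, fast_mbs, slow_mbs):
    """Seconds to write total_gb when the first cache_gb goes at
    fast_mbs (SLC/DRAM cache) and the remainder at slow_mbs (native NAND)."""
    cached = min(total_gb, cache_gb)
    rest = total_gb - cached
    return cached * 1000 / fast_mbs + rest * 1000 / slow_mbs

# Hypothetical TLC drive: 100 GB cache at 5000 MB/s, then 1500 MB/s after.
# Hypothetical Optane-like drive: no cache, a constant 2500 MB/s.
tlc = sustained_write_time(500, 100, 5000, 1500)   # 20 s + ~266.7 s
optane = sustained_write_time(500, 0, 5000, 2500)  # 200 s flat
print(round(tlc), round(optane))  # -> 287 200
```

Despite the TLC drive's flashy 5000 MB/s headline figure, the constant-speed drive wins any sustained write larger than its rival's cache.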

I thus decided to make the Optane my new main system drive in the Helios.

Since the Optane is a PCIe drive and the Asus Z690 Maximus Extreme only has two PCIe x16 slots (both connected directly to the CPU), this meant the 4090 GPU would have to share bandwidth with the Optane, going from full PCIe 4.0 x16 to PCIe 4.0 x8. Not a biggie at all since we are talking PCIe 4.0 speeds here: x8 is equivalent to full x16 speed on a PCIe 3.0 system.
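The lane math is easy to verify: PCIe 3.0 and newer use 128b/130b encoding, so per-lane throughput simply doubles each generation, and halving the lanes while moving up one generation is a wash. A small sketch:

```python
# Per-lane transfer rates in GT/s for PCIe generations that use 128b/130b
# encoding (3.0 and newer). Usable GB/s = GT/s * (128/130) / 8 bits.
RATE_GT = {"3.0": 8, "4.0": 16, "5.0": 32}

def bandwidth_gbs(gen, lanes):
    """Approximate one-direction bandwidth of a PCIe link in GB/s."""
    return RATE_GT[gen] * (128 / 130) / 8 * lanes

# PCIe 4.0 x8 delivers the same bandwidth as PCIe 3.0 x16:
print(round(bandwidth_gbs("4.0", 8), 2))   # -> 15.75
print(round(bandwidth_gbs("3.0", 16), 2))  # -> 15.75
```

This ignores protocol overhead beyond line encoding, but it is exactly why dropping the GPU to x8 on a PCIe 4.0 board costs so little in practice.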

Image

I was also running out of space for games on my two 2TB 970 EVO drives. This way I could re-use the Firecuda as an extra 2TB drive dedicated to games.

So, I ordered a 1TB 980 Pro, cloned the Optane into it, removed the 905P once the process was finished and the 1TB 980 Pro became the new main system drive in the C700M system.

Now, making the Optane drive my new main system drive in the Helios was going to be a bit more complicated than that. First I had to clone the 2TB Firecuda onto the smaller 960GB Optane drive (not much of a problem, because the actual data would still fit inside the 960 GB Optane, but it would mean resizing and moving partitions, etc.), then I would have to remove the Firecuda BEFORE booting the system again, otherwise Windows might become confused by two drives sharing the same ID/signature, with catastrophic results.

A clone is an identical copy of one drive to a second drive, which includes duplicating the source disk signature on the target drive.

When cloning, it is always recommended to shutdown after the operation completes, then remove the source drive and replace it with the target drive before attempting to boot into Windows. Booting into Windows with two drives holding the same disk signature can cause some very bad things to happen (among other things it can result in BOTH the source and target disks becoming unbootable, ask me how I know lol) so it should always be avoided.
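For the curious, on an MBR-partitioned drive that signature is just the 4 bytes at offset 0x1B8 of sector 0 (GPT disks use a GUID instead, but the collision problem is the same). A minimal sketch of extracting it, using a fabricated in-memory sector rather than a real device:

```python
import struct

# On an MBR-partitioned drive, the Windows disk signature is the 4 bytes
# at offset 0x1B8 (440) of the first sector. A clone copies sector 0
# verbatim, which is exactly why two cloned drives end up colliding.

def disk_signature(first_sector: bytes) -> int:
    """Extract the 32-bit little-endian disk signature from sector 0."""
    return struct.unpack_from("<I", first_sector, 0x1B8)[0]

# Fabricated 512-byte sector with signature 0xDEADBEEF, for illustration:
sector = bytearray(512)
struct.pack_into("<I", sector, 0x1B8, 0xDEADBEEF)
print(hex(disk_signature(bytes(sector))))  # -> 0xdeadbeef
```

Reading sector 0 of an actual drive requires administrator rights; on Windows you can also inspect signatures non-destructively with diskpart's uniqueid command.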

So, basically, the process consisted of:

1. Installing the Optane and cloning the Firecuda onto it using an Acronis Recovery disk.
2. Removing the Optane and the 4090 to gain access to the Firecuda below the motherboard shield, then removing the Firecuda.
3. Re-installing the 4090 and the Optane to make sure the system was still booting into Windows.
4. Installing the Firecuda into an external NVMe to USB enclosure (I used a ROG Strix Arion I have here) so I could quickly re-partition and format it.
5. Removing the Optane and the 4090 again to re-install the (now empty) Firecuda, and finally putting the 4090 and the Optane back into the system.

Uff! I get tired just talking about the whole process :D The whole thing would be a lot easier and much quicker if you could simply disable individual M.2 slots in the BIOS, as you can with SATA ports.

So at this point the system in terms of storage consisted of:

1. 960GB Intel Optane 905P PCIe NVMe
2. 2 TB Samsung 970 EVO Plus
3. 2 TB Samsung 970 EVO Plus (#2), both EVO NVMe drives installed on the DIMM.2 module
4. 2 TB Seagate Firecuda 530
5. 1 TB T-Force SATA SSD
6. 4 TB Western Digital hard disk to store old backups, etc.

So basically 8 TB of sweet, sweet, flash storage.

But things would not stop here. Oh no. I still hadn't given up on the idea of getting that GPU vertical, but it would take a few more months before I did anything about it.

 