For years I have said, give me a chart, a stopwatch and a compass and I’ll navigate anywhere – whether it’s in a plane or on a boat. As I get older, the bravado wanes and the complexities of (safe) navigation increase. We now have to watch out for an ever-increasing number of ships on passage, debris and flotsam in the oceans, not to mention rocks in the shallows, as we try to navigate the optimum (fastest) routes from A to B.
After much soul searching and due consideration, I have decided that the current chart plotter and nav system on Pamela C is woefully inadequate for the planned voyages ahead. Safety is paramount and having up to date electronic charts, access to ship’s metrics, a working and accurate autohelm (and more) is vital if we are to survive some of the passages that are planned.
Just having a vaguely accurate Speed over Ground (SoG), a Course to Steer (CTS) that may only be partially correct depending on how accurate the onboard compass is, an autohelm that only seems to work when it wants to, and charts that (in the cockpit at least) are woefully lacking, just isn’t enough any more.
With this and other technical requirements in mind, I have opted for the B&G Zeus 3 system. It seems to be head and shoulders above the rest as far as quality and features go, and whilst I could have waited another year (or 3) for B&G to approach and offer to make me an ambassador with all the free gadgets that this seemingly entails, I felt it prudent to go for the upgrade now (and pay for it out of my own pocket) rather than wait on the chance of winning a lottery! 🙂
I have played with Savvy Navvy and Navionics on a tablet, and whilst they are great for planning and do help with situational awareness, there are too many ways things can go wrong, and they only really work when you have internet connectivity.
As such, I have bitten the bullet and ordered a new Zeus 3S chart plotter, Precision 9 compass and a new HALO 20+ Radar for Pamela C, along with new transducers that monitor speed, depth and water temperature and integrate together with my AIS, Radio and other components using an NMEA2K network – something I have already been busy installing and upgrading.
This of course means yet more expenditure and still relatively minimal income. If you haven’t yet considered becoming a Patron, now would be a great time to sign up! If you aren’t ready to make a regular monthly commitment then you can make a one-off donation/contribution through PayPal. Every penny helps, and as a Patron, you will receive regular updates, entries into our regular crew draw, the winner of which gets invited to spend a week with us on board somewhere in the world and more!
One of the things you need when sailing a boat, is a smaller boat to get you to and from shore. Yes, you can just take your big sailboat into a marina and pay £20-£50 a day for the privilege (ok, so the South Coast is more expensive than the North).
The tender is your car, for all intents and purposes. It is the vehicle you use most often to get to the beach, to go shopping, go visit neighbours and more. Traditionally your tender has been powered either by human muscle (oars) or a small petrol-driven outboard (either 2-stroke or 4-stroke). In today’s greener world though, there are other options worth considering. Electric powered engines are really beginning to hold their own, and whilst still more expensive than petrol, they are coming down in price and becoming more and more affordable.
I recently started a thread on one of the Facebook sailing forums asking for input from the veteran members as to whether or not I should buy an electric outboard or go for a (used) petrol engine. The feedback I received was both insightful and also somewhat surprising.
One of the biggest comments was about re-charging your batteries. With a petrol outboard, you just stick a few litres of petrol in a can, pull the cord and off you go. With an electric engine, you have to consider the fact that if the battery is flat, you need to wait up to 12 hours to recharge it on your boat – and then you may only get 1 hour of usage out of it (less if you run it at full power). Batteries (like the one pictured above) aren’t cheap. The ePropulsion Spirit battery costs £800 to buy, not to mention the £1,650 for the motor itself. Tests I’ve seen so far seem to confirm the 6-8 hour usage scenario from a single charge, but also show a full charge taking closer to 24 hours than the 8 hours advertised (when charged from 12 volt DC systems, which is the norm here).
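The arithmetic behind those charge and runtime figures is simple enough to sketch. The capacity, charger wattage and efficiency numbers below are illustrative assumptions on my part (a roughly 1.3kWh battery and a 1kW motor), not manufacturer specs:

```python
# Back-of-envelope charge-time and runtime arithmetic for an electric outboard.
# All figures are illustrative assumptions, not ePropulsion's official specs.

def charge_hours(capacity_wh: float, charger_w: float, efficiency: float = 0.85) -> float:
    """Hours to charge a battery of capacity_wh from a charger delivering
    charger_w, allowing for charging losses."""
    return capacity_wh / (charger_w * efficiency)

def runtime_hours(capacity_wh: float, draw_w: float) -> float:
    """Hours of runtime at a constant power draw."""
    return capacity_wh / draw_w

# A ~1.3kWh battery charged from a 12V DC system at ~10A (about 120W in):
print(round(charge_hours(1276, 120), 1))    # roughly 12.5 hours
# The same battery driving a ~1kW motor at full throttle:
print(round(runtime_hours(1276, 1000), 1))  # barely 1.3 hours
```

Which is why the 12-24 hour charge times from on-board 12V systems aren’t surprising: a mains charger can push far more watts than a boat’s DC system will.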
It floats (which is good) and is waterproof (essential) but takes 8.5 hours to charge, which means you probably need to keep at least one in reserve on charge at all times, and possibly carry a spare fully charged one on your boat.
When you’re living off the hook, you have to consider every amp-hour of battery usage on your boat, and turning the engine on to recharge the house batteries is less than optimal. You can install solar power, but you’re likely to only get 30-40 amp-hours of charge out of a 600W solar array in a 24 hour period, and space is at a premium so you may not even be able to host 600W of panels.
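To see why 600W of panels yields so little, here is a rough sketch of the maths. The sun-hours and derate figures are my own illustrative assumptions; the point is how far real-world marine output falls below the nameplate figure:

```python
# Rough daily solar yield estimate. Sun-hours and derate are assumptions
# for illustration, not measured values.

def daily_amp_hours(array_w: float, sun_hours: float, derate: float, system_v: float = 12.0) -> float:
    """Estimated amp-hours per day from a solar array after losses."""
    return array_w * sun_hours * derate / system_v

# Brochure-style maths: 600W, 4 good sun-hours, 75% system efficiency:
print(daily_amp_hours(600, 4.0, 0.75))         # 150.0 Ah
# The 30-40Ah seen in practice implies an effective derate nearer 15-20%,
# once shading from rigging, panel angle and weather are accounted for:
print(round(daily_amp_hours(600, 4.0, 0.16)))  # ~32 Ah
```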
So, considering how green everyone is generally in the sailing community, I was somewhat surprised to find that the general consensus was to get a 4-stroke engine and continue to use that for as long as possible.
The downsides of petrol engines are reliability and weight. 2-stroke engines are generally just not reliable enough and 4-stroke engines are heavy.
What makes this whole process harder is the UK Government’s recent switch to E10 fuel. Not all (old) outboard engines will run on E10, and whilst E5 will remain in circulation for some time to come, it is going to become more and more expensive (I recently paid £1.52/litre for E5 at a forecourt) as supplies are wound down.
There were the usual debates about the damage to the environment caused by the manufacture of solar panels and lithium-ion (LiFePO4) batteries, etc.
The manufacturer of ePropulsion engines boasts some interesting usage figures compared to a petrol outboard. The question, though, is will this new generation of electric outboard actually convince people to switch?
One of the problems with using ethanol-based fuels is that ethanol attracts water, making it particularly unsuitable for marine use: the water can reach the carburettor, reducing performance and increasing fuel consumption.
One of the issues I have noticed reading reviews of the ePropulsion outboard is that it seems, like an iPhone, to be designed for zero maintenance and zero repairability. Owners have reported broken battery cases after dropping the battery onto a wooden deck from a foot or so up; when they contacted the manufacturer for spares/repairs, they were simply told to buy a new battery – and in fact a new engine too, as there was a newer version out. The ePropulsion motor comes with a 2 year warranty, and nothing much more after that – is that planned obsolescence?
Compare this to a 2-stroke or 4-stroke engine which, although requiring annual maintenance, is infinitely more repairable; petrol engines have been known to run for 10-20 years without needing to be replaced or becoming “obsolete”.
It seems the steamroller switch to electric is inevitable, and whichever option I choose, the costs (to the environment as well as my pocket) are likely to be steep. In the short term, though, I think I may well stick with a used, 10 year old outboard and deal with the hassles of maintenance for the next 5+ years while I wait for electric outboards to become considerably more mainstream – or be replaced with hydrogen-powered engines instead!
It’s been a while now since the most expensive chair I have ever bought arrived, and I have to admit it was worth the money!
I have had a bad back for years, made worse by a stupid accident at work which resulted in me being laid up for the best part of 3 months with torn ligaments.
Suffice it to say, proper posture and a comfortable seat are important to me – so much so in fact that at Raindance I’ve actually had to go as far as getting a proper chair to sit in. This was a “cheap” £75 chair from Amazon, the same as one I’d had at home for a year or so. Comfortable, but it didn’t stand the test of time and the arms broke off it not once, but twice!
I finally decided it was time to take the plunge as I’d heard such good things about Secret Lab chairs …. Turns out, they weren’t lying!
The chair took forever to arrive, but once it did it was obvious why it costs so much money. The build quality was like nothing I have ever seen before! Putting it together wasn’t too difficult and within half an hour or so I was sat in my new throne.
Comfort, posture, adjustable everything – this has it all.
Having pulled the trigger and finally decided to move to Windows from OSX, I thought I’d list a few things (in no particular order) that I love / hate about the move: what am I missing, and what has blown my mind?
First off, Windows Hello. This is FaceID for the desktop and is awesome! You need a camera that supports 3D scanning, such as the Logitech Brio 4K, which isn’t cheap, but seeing as I’m always mistyping passwords and forgetting PINs, this seemed like a really nice upgrade from my old Logitech C920. And, thanks to COVID, I’ve managed to sell the C920 for about the same as I paid for it 2 years ago, so, all in all, it’s a win-win!
Next, performance. This machine is blisteringly fast! I have never seen Cinebench run that fast.
Windows does try to be a little clever at times. Each time it detected a change on the video outputs (whether I switched the monitor input to another computer or the screen simply went to sleep), Windows was randomly rescanning speakers and microphones and giving the HTC Vive headset priority over the Yeti microphone and Bose USB speakers. I have now, finally (I hope), found a fix for this, and have Steam VR playing through the VR headset when I want it to, without impacting my day to day enjoyment of Spotify etc. (whether there’s a monitor attached to the PC or not – useful while I’m decommissioning my old Macs).
I was amazed at how good Cortana actually is. Probably not quite as good as Alexa, but light years ahead of Siri. Siri is sadly another example of something that Apple innovated well with and then dropped the ball on. While Amazon (and Microsoft) have continued to develop the underlying technology, making the AI more intelligent and responsive, Apple just seem to have focussed on making Siri sound better (debatable) without actually working on the AI or recognition factors.
Edge / Internet Explorer / etc have always sucked. They’ve just really been bad. Microsoft seem to have finally come to terms with this and have redesigned/rewritten their web browser, basing it on Chromium. Yes, you heard it: Microsoft Internet Explorer is now Google Chrome with a Windows logo on it. Well, almost. Annoyingly (for me), they’ve also replaced the Google sync sign-in with sign-in to Microsoft Live – not augmented it so we can have either, just replaced it. This means I’m not really using Edge as much as I could, because of the way Microsoft consume everything they touch, and the fact that I’m already consumed by Google (at this stage at least) and to a lesser extent Fido (which I owned and ran for 20+ years before finally hanging up my Director’s braces and donning my film maker’s baseball cap instead!).
Where to start! There are literally thousands of games available for Windows and new top-tier games appearing almost daily. Whether it’s the latest release of Doom, Tomb Raider, Assassin’s Creed, Diablo from Blizzard, Call of Duty or Counter-Strike, the list is almost endless!
My personal favourite game (since 1984) has been Elite. Obviously I’m not still playing the 1984 BBC Micro version today (although I still can, through an emulator!), but Elite Dangerous, released in 2014, and now Elite Dangerous: Horizons are awesome space sims.
The VR gaming experience is amazing. I’m still blown away by how realistic it can feel.
VR as a whole is still only just coming into its own. May 2020 and we’re still in lockdown, however, I was able to “get out” and walk around Stonehenge, go for a parachute jump out of an aeroplane and even walk around art galleries in London, all without actually leaving my house. Having been staring at the same 4 walls for over a month now, I needed the escapism – and through the world of VR, it was actually very realistic. I’ve also found a world of VR content online with short films and interactive experiences that I’m working through, some of which are incredibly realistic and make you feel as though you are totally there in the room and in the moment!
How Easy is it to switch?
Ok, so the UI is different and takes a little getting used to, but if you can navigate OSX then you can navigate Windows 10 – although the window controls (minimise/maximise/close) are on the right, not the left.
Applications – do I miss them?
Look at this list of apps … as you will see, most if not all of them (apart from FCPX) have native Windows versions.
To be honest, the only real deal breaker might have been the loss of iMessage. That said, millions of Android and Windows users have survived without it, and I still use Telegram and Facebook Messenger every day, and WhatsApp if I have to. I still have iMessage on my phone and tablet … and I’m not dropping those, not for a while yet anyway! (Note: I did try 2 years ago and bought a Samsung Galaxy S10. I lasted a week and had to send the phone back; I just couldn’t get on with Android – or the fact that apps constantly crashed.)
Subtle UI differences
There are some subtle UI differences. For years I’ve been able to drag a file from a folder to a dialogue box to change the directory that the dialogue box is searching in to open a file. On Windows, if you drag the file from a folder to the dialogue box it actually moves the file into the folder that the dialogue box has open and doesn’t change the directory. Not a major issue, but something I got caught out on initially.
Likewise, I have been used to clicking on the icon at the top of a window to copy/move a file; this doesn’t work on Windows. I need to do a “save as” instead – no big deal.
Backups. Time Machine has been amazing. I’ve rarely ever needed to use it, but I have done a full bare metal restore with it twice now, I think. Windows has snapshots, but I think I might need to “buy” a 3rd party backup tool (thankfully I have the QNAP, and that has backup tech built in).
Last week I wrote a brain dump on why I finally decided, after almost 20 years of being an Apple fan boy, that it was time to move “back” to Windows and PC as a desktop.
The benchmarks and bank balance tell a good part of the story: a machine which is 30-60% faster and yet roughly 1/3 the price of the Apple equivalent – and that’s with me throwing in a few frivolous extras like RGB lighting and an “Elite” case when a basic case would have done.
My daily editing work-flow has been based around Davinci Resolve for a while now. I used to be a devout Final Cut Editor, from “back in the day” with Final Cut Studio and iDVD, all the way through the trials and tribulations of Final Cut Pro X. There was a time when I used to edit in Adobe Premiere, however when I decided to move to Apple in 2000, Adobe annoyed me by saying that I would have to re-buy their entire suite of tools if I wanted to use them on the Mac, my Windows licenses were not transferable … so I did the big “screw you Adobe” and bought Final Cut instead (for over £1,000 at the time).
Roll on 2020 and the world of lockdown and quarantine. I’ve had a lot of spare time on my hands the last few weeks and finally decided that it was time to pull the trigger and move to Windows. I had been experimenting with Windows 10 in a virtual machine on my Mac desktop, and have been supporting Windows Server 201x installations for years as part of my $dayjob. Now was the time to jump in with both feet.
Rather than go for a “toe in the water” build (Ryzen 5 1600 AF, 16GB RAM and an RTX 2060 Super), which would potentially have cost me about £1,200, I decided to splurge on the Ryzen 9 3950X processor, 64GB of RAM and an RTX 2080 Ti GPU. The total build comes in at about £3,500 with VAT, including an X570 motherboard, case, power supply and ultra-fast NVMe storage.
Why did I choose the ASRock X570 Creator motherboard instead of a cheaper but still viable motherboard? Well, a year ago I migrated all my storage onto a new QNAP TS-932X SAN with dual 10 Gig NICs and spent a small fortune on 10 Gig switches and 10 Gig interfaces for the Mac Pro and my MacBook Pro (the dongle life!). As I was going to need 10 Gig, I could either buy a motherboard with it built in or lose an expansion slot to a 10 Gig NIC. I may want those expansion slots (3 x PCIe 4.0) for NVIDIA RTX graphics cards (especially when the RTX 3000 range launches with PCIe 4.0 support later this year!).
I guess I went for the Ryzen 9 3950X instead of a cheaper CPU as I didn’t want to have to upgrade again for at least 12-18 months (the CPUs can be swapped without needing to replace the entire machine) and the 16 cores/32 threads just got me excited. I could rationalise till the cows come home, but fundamentally more cores mean faster performance (rendering) and I wanted a silky smooth 8K editing workstation, as well as an amazing gaming experience!
How has the experience been so far?
Having missed out on the debacle that was Windows 7, Windows Vista and, to be honest, even Windows Millennium Edition, I have come to Windows 10 without a lot of the negativity and bad experiences that other Windows users have had over the last 25 years or so. On the whole, I quite like the Windows 10 UI, and whilst there have been some niggles with Windows trying to be “clever”, life so far has been relatively easy.
Ok, so I have had some issues. Office 365/Outlook no longer lets me add custom Exchange servers, so I can’t use Outlook to read my 5 different email accounts – all of which are hosted on Fido Glide (a Zimbra based email service). But I was shocked and amazed to find that Windows Mail is actually usable now, and handles Exchange (EWS) mailboxes and even Google Suite/Gmail mailboxes as well as the usual POP/IMAP setups. I do honestly wonder why I’m bothering to pay for the Office 365 subscription these days (£79.99/year), as I’m using Google Docs for anything vaguely Word/Excel/PowerPoint related, I can no longer use Outlook (not that I ever really have done) and I really don’t need Microsoft Access or whatever else they bundle with the Office suite.
Editing on the NAS was problematic at times, not because of the network but because of Mac OS X’s incredibly poor Finder integration with CIFS/Samba. A few years ago, Apple quietly dropped support for their own network file technology, AFP (Apple Filing Protocol), and instead recommended everyone use CIFS (Samba) for networking. Final Cut Pro was tweaked to support CIFS shares and AFP was basically left to fall by the wayside. Getting Samba (SMB/CIFS) to work initially was tricky, and you had to make various tweaks in smb.conf. Eventually Samba matured and became reliable, however it has never been as rock solid as AFP (in my experience) on the Mac.
With some trepidation, I performed file system performance tests on the new Ryzen based PC using the built in 10 Gig NIC. Performance was good. Amazingly so in fact. Twice the speed of the Mac Pro 2013 (850MB/sec vs 420MB/sec) despite using exactly the same cables, switch ports, files, software, everything. The only difference was the OS and of course the new hardware. Even better are the local disks. 4000MB/second over PCIe4.0 compared to barely 1000MB/second on the Mac Pro / MacBook Pro.
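For anyone wanting to reproduce a crude version of these sequential throughput tests, a standard-library sketch like this is enough for ballpark numbers. Bear in mind that OS page caching will flatter repeat runs, so test against a file larger than RAM (or a freshly mounted share):

```python
import os
import time

def sequential_read_mb_per_s(path: str, chunk_bytes: int = 1 << 20) -> float:
    """Crude sequential read benchmark: read the whole file in 1MB chunks
    and report MB/sec. OS caching inflates repeat runs, so use a file
    larger than RAM when testing a disk or network share."""
    size_mb = os.path.getsize(path) / (1 << 20)
    start = time.perf_counter()
    with open(path, "rb") as f:
        while f.read(chunk_bytes):
            pass
    return size_mb / (time.perf_counter() - start)
```

Point it at a multi-gigabyte file sitting on the SAN (e.g. a ProRes master) and compare the figure from each machine over the same switch ports and cables.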
So finally came time to test the entire setup in a real world scenario. Davinci Resolve rendering the last short film I made (Geneticide). The short is 7 minutes long, 4K and includes various SFX, a full colour grade in Davinci Resolve and sound design (with OpenFX effects on the sounds) in Fairlight. The project took approximately 30 minutes to render on Mac Pro and 21 minutes on the MacBook Pro with an eGPU (Radeon RX580). On the Ryzen PC, the entire project rendered from scratch in 4’48”.
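Converting those render times into speed-up factors (times taken straight from the paragraph above):

```python
# Render times for the same 7-minute 4K project, in seconds.
mac_pro_s = 30 * 60        # Mac Pro 2013: ~30 minutes
macbook_egpu_s = 21 * 60   # MacBook Pro + RX580 eGPU: ~21 minutes
ryzen_s = 4 * 60 + 48      # Ryzen PC: 4'48"

print(round(mac_pro_s / ryzen_s, 2))       # 6.25x faster than the Mac Pro
print(round(macbook_egpu_s / ryzen_s, 2))  # 4.38x faster than the MacBook + eGPU
```

Put another way, the Ryzen box renders this project faster than real time, which the Mac Pro never came close to.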
I had to update the project and change the paths, as I’d had so many problems rendering the project over the SAN on OSX that I had actually copied all the material to a local Thunderbolt 2 RAID array (Pegasus R8) and used that as a scratch disk. I undid the kludges (which took about 5 minutes), pointed Davinci back at the SAN storage for all files, updated the destination path from the Mac desktop to the Windows desktop and hit render. I was amazed at how quickly Davinci sped through the footage, and the file it created looked perfect – no artefacts or glitches anywhere!
What else has improved? Well, a year ago I was buoyed by Apple’s announcement of VR support coming to OSX, and I bought the HTC Vive headset. I also love playing Elite Dangerous and had previously bought the Oculus Rift Developer Kit so I could play Elite through the early beta phases as a backer. Elite sadly outgrew my Mac, and Frontier eventually dropped support for Elite on the Mac because the hardware simply wasn’t up to the job. Remember that impassioned speech from Steve Jobs about killer graphics going forwards? It seems the boffins at Apple have certainly forgotten it 🙁 Even Valve, the poster boys for VR at WWDC 2017, have now announced they’re dropping Mac support on their platform.
Once upon a time the Mac was a powerhouse, the machine you aspired to own, something immutable. Now, Apple’s phones are more powerful than their desktops and Apple only seem interested in selling you a new colour strap for your Apple Watch. They may be big now, but all it will take is for another manufacturer (Samsung?) to innovate and produce a better handset than Apple and the sheep will migrate. Yes, Apple are trying to lock people into their ecosystem with Apple TV and iTunes/Apple Music/etc, but that marketplace is saturated, and given time another Napster will emerge – and chances are the customer base won’t be all that loyal to Apple.
Check back in next week for more details on the actual work flow and how I’m coping after what will have been roughly 3 weeks using Windows as my daily driver.
What follows is an attempt to explain why, after almost 20 years of being an Apple/Mac devotee (fan boy) I have finally reverted to “the dark side” … and (spoiler alert!) I’m loving it!
Having been a Windows user since the early days (well Windows 3.11 and Windows NT) and then IBM’s OS/2, as well as Linux (Slackware!), I was looking for something “better”. In steps a young Steve Jobs with a possible solution!
So, it’s January 2000, and Steve Jobs announces plans for a new state of the art operating system. Fast forward to September 13th 2000, and Apple has released Public Beta 1 of their new (to become) flagship operating system “Mac OS X”.
The promise of “state of the art plumbing” (hardware), killer graphics, designed for the Internet from the ground up … and the promise to “make the next great personal computer operating system”. Having battled with OS/2, Windows 3.11, Novell Netware and even Slackware, I was keen to see what Apple had to offer, so I bought a PowerBook, initially with System 9 (the old Mac OS) installed, and waited with bated breath for the beta of OSX which I’d ordered online. The future was here, and it looked amazing!
Since then, there have been 15 iterations of OSX, Steve Jobs sadly lost a battle with pancreatic cancer, and Apple had a few false starts in their “Pro” lineup – launching Final Cut Pro X and dropping support for Final Cut Studio on the same day, which saw hundreds of thousands of filmmakers move to Adobe Premiere overnight when they saw how incomplete FCP X was and that Final Cut Studio was no more. One of the biggest mistakes Apple made here was having no migration path for existing Final Cut users: they upgraded to the new version and had no way of opening any of their old Final Cut projects (unless they installed Premiere, which had the option to import from FCP).
Apple went from strength to strength in the Pro market from a hardware perspective, with adverts extolling the virtues of the Power Mac G4 and then the G5 – so powerful it was classed as a munition and needed a US export license. They eventually moved from PowerPC processors to Intel processors and did that move so seamlessly that no-one really noticed the processor under the hood.
I was hooked! I bought Power Mac G4s, Power Mac G5s, iMacs … I moved a number of my consultancy/business customers onto them because they were so easy to use and such low maintenance (this actually backfired, as we lost a fortune in revenue from callouts that never happened to clean viruses from the machines. Windows users were so often in trouble with malware and viruses, but Mac users never seemed to experience the problem).
They also had more than a few problems with their “Pro” lineup of hardware though: the i9 MacBook Pro overheated and had to be “crippled” with a firmware update (which only worked in OSX, so if you booted into Windows you would still thermal throttle). They launched the trash can Mac Pro (the iBin) in 2013. Barely upgradeable, it was out of date almost as soon as it launched. A small hardware refresh happened after a couple of years, but it was still underpowered and under-spec’d – with no sign of anything new on the horizon. Apple finally realised this and launched the iMac Pro as a stop-gap (with an eye-watering $17,000 price tag for anything powerful enough), then made a gushing apology at WWDC and promised a new Mac Pro that would be upgradeable, powerful and, of course, stylish. This was the “cheese grater”. They failed to mention that it would cost more than a Tesla and would still take them nearly 2 years to deliver.
Over the period from 2000 to 2018 I loyally gave Apple my hard-earned cash, falling for every marketing line they offered up, whilst slowly wondering if they were ever going to actually innovate any more (beyond the amazing new colours for their watch straps and phone cases). The iPhone is getting bigger and bigger, the price of their kit is getting more and more expensive, and I’ve now realised that if I want to buy a Mac Pro of any relatively decent specification, I need to re-mortgage the house to do it. At the same time, I look at the actual specifications of the hardware that Apple are putting into the new Mac Pro and the prices of the components: a CPU that can theoretically handle 2TB of RAM and costs $7,500 (although that 2TB is realistically only 1.5TB – and Intel actually sell an equivalent CPU that can address up to 1TB of physical RAM for $3,800).
Custom NVMe SSD cards that cost 4x the price of a standard NVMe drive, and similar “price gouging” on RAM upgrades – and I finally woke up: Apple are milking their “sheep” fan base as hard as they can. I also started to evaluate what my workflows were and where the performance bottlenecks lay, as well as what I could do to improve these on either Apple or PC, and whether or not I could build a platform that was actually agnostic.
Disk performance test after disk performance test made me realise that whilst Thunderbolt 2 attached external storage was faster, it wasn’t scalable. £3,500 for a 24TB Pegasus2 RAID unit was pricey, but it boosted disk access times from 80MB/sec to 400MB/sec, which meant I could work on 4K video footage in Final Cut. At this stage, I’m still a heavy Final Cut Pro X user, but at the same time I have started to explore alternatives. I have enrolled in film school with Raindance and they are pushing Adobe Premiere as the NLE of choice; meanwhile, Davinci Resolve is starting to look like a serious contender. I have a debate with the lecturers at Raindance and convince them that so long as my work is delivered on time and to a standard, it shouldn’t actually matter which NLE I’m using. An NLE is an NLE – whether I’m using iMovie, Final Cut, Adobe Premiere or Davinci Resolve – and being able to work in all of these editors should in fact be a bonus were I to try to get a job with a production house at the end of the course.
I decide that it’s time to seriously consider a switch, and spend far too long working out the ideal Intel-based i9 build, Z370 motherboard, RAM and more … and then AMD catches my eye, with Threadripper 2. These are pricey – $3,999 for a 32 core (64 thread) CPU – but boy oh boy is the whole Zen 2 architecture exciting! And then AMD releases Ryzen 3000, specifically the Ryzen 9 3950X with 16 cores (32 threads) and a price that’s shockingly affordable. It only addresses 128GB of RAM, though (who am I kidding – I’ve been stuck with 16GB as the most my poor MacBook Pro could handle). I spend a few more weeks (months?) thinking about it and then finally pull the trigger in March 2020.
The machine should outperform almost anything else out there currently, especially with the PCIe 4.0 architecture. I spec a PCIe 4.0 SSD which benchmarks at over 4000MB/second in read/write tests, and choose a heavy duty X570 motherboard with built-in 10 Gig ethernet, 802.11ax (WiFi 6), Thunderbolt 3 and support for Zen 2 / Ryzen 9 (including the 3950X) processors.
The build took me about 2 hours, over an hour of which was spent working out the optimal route for the power cables and water cooling so they would look “pretty” through the tempered glass side – and of course the RGB. I confess I am quite pleased with the result.
Ryzen and a single RTX 2080Ti wipe the floor with Apple and Dual FirePro D300’s
Cinebench R20 is equally as impressive
Ryzen scores 8949 whilst the old Mac Pro scored 1416
Ahh, I hear you say, but that’s against a 7 year old Mac Pro. How does your new build compare to the current Mac Pro 2019? Well, I don’t have one to run the tests on myself – however, a quick Google of benchmarks comes back with:
Ryzen 9 based Windows machine is half the price and 30%-60% faster in benchmarks
Enough of the history, if you’re interested in the day to day operation and how well it fits into my workflow then read on – part two coming up next week as I make notes on how Windows 10 fits into my workflow and what (if anything) am I missing from the OSX days.
So, over the last year, I have been working more and more on video editing and producing 4K content as part of my in-depth exploration of my childhood dream to become a film maker (insert mid-life crisis jokes here!) 🙂
One of the biggest problems I’ve had to date has been storage space, finding enough space to keep all the video I have been creating, the B-Roll, the content libraries and more.
Having bought a Promise Pegasus2 R8 RAID array with 8 x 3TB drives and Thunderbolt 2, this quest for storage had been satiated for quite some time. However, as the 18TB of (usable) space is being eaten up rapidly (now I am filming in 4K and 6K ProRes RAW) and I am creating more and more content on an almost daily basis, I needed something bigger …. and FASTER.
A new problem has arisen, one which I had previously not anticipated: I need storage which is also fast enough to edit 4K/6K footage on. The project files are generally too large to work on my local 1TB m.2 SSD in the iBin (Mac Pro Late 2013), as that only ever seems to have 200GB-300GB of space free, and that can be the size of the cache for a single project these days. The Promise RAID solution has been good, but I’m only really seeing 180MB/sec out of the array, which is proving not to be enough as I start to render complex projects with multiple layers and effects. I’m also sometimes working on two computers simultaneously (my MacBook Pro 2017 with discrete GPU is now faster than my desktop, so sometimes I move to work on that). I have been syncing the project file between the Promise RAID and an m.2 SSD drive, which gives me nearly 500MB/sec over USB-C to the MacBook … but it is only 1TB, so only really good enough for a single project at a time, and it doesn’t hold the library of B-Roll I’m building … so I need to copy that from the library, which means duplicate files everywhere eating more disk space.
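To make the “fast enough” requirement concrete, here’s a rough sketch of how many simultaneous streams a given array can feed. The per-stream bitrate is an illustrative assumption on my part (ProRes flavours vary enormously), not a measured figure:

```python
def max_streams(array_mb_per_s: float, stream_mb_per_s: float, headroom: float = 0.8) -> int:
    """How many simultaneous video streams an array can feed, keeping 20%
    headroom for seeks and cache writes. Figures are illustrative only."""
    return int(array_mb_per_s * headroom // stream_mb_per_s)

# Assuming a hypothetical ~90MB/sec per high-bitrate 4K stream:
print(max_streams(180, 90))  # the 180MB/sec RAID feeds just 1 stream with headroom
print(max_streams(500, 90))  # ~4 streams over the USB-C SSD's ~500MB/sec
```

Which is exactly the problem: a multi-layer timeline is several streams at once, and 180MB/sec simply doesn’t leave room for that.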
In my day job we’ve been using 10Gb networks and wide-area storage arrays (Ceph) for years. They’re fast, efficient, almost infinitely scalable and relatively “cheap” compared to other SAN solutions on the market … we have 200TB+ of storage and we can grow that daily just by adding more drives/chassis into the network. This, however, is overkill for a domestic/SoHo solution (with 80+ drives and 20 servers and counting, it is definitely a “carrier grade” setup!).
So I thought it was now time to merge my expertise in Enterprise storage and networking with my hobby and need for something which is “better” all round.
Historically, the secret to faster storage has always been “more spindles”. The more disks you have in your array, the faster the data access is. This is still true, to a degree, but you’re still going to hit bottlenecks with the storage, namely the 6Gb/sec (now 12Gb/sec) speed of the SATA/SAS interface and the 7200RPM spin speed of the disks (yes, you can get 15K RPM drives, but they’re either ludicrously expensive, or small, or both).
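A quick back-of-envelope calculation shows why “more spindles” only gets you so far. Assuming a typical ~150MB/sec of sequential throughput per 7200RPM drive (an illustrative figure, not a measurement from my array), the theoretical ceiling of an 8-drive stripe is:

```shell
# Theoretical aggregate throughput of an 8-drive stripe, assuming
# ~150 MB/s sequential per 7200RPM spindle (illustrative figure).
DRIVES=8
PER_DRIVE_MBS=150
echo $((DRIVES * PER_DRIVE_MBS))   # ceiling in MB/s
```

The gap between a ceiling like that and the ~180MB/sec I actually see from the Promise array suggests the real bottleneck is elsewhere in the chain (controller, interface, RAID overhead) rather than the spindles themselves.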
SSDs were always a “nice” option, but they were small and still suffered from the 6Gb/sec bottleneck of the SATA interface. Add to that the reliability issues of MLC storage and the cost of SLC storage (article: SLC vs MLC), which made NAND flash storage devices impractical. I have had many SSDs fail, some after just a few days of use, some after many months. Spending $500 on something which might only last you two weeks is not good business sense.
Today, we have a new generation of V-NAND NVMe flash drives which offer up to seven (7) times the speed and much higher reliability, interfacing directly with PCIe and bypassing previous bottlenecks like the SAS/SATA interface. And they’re (relatively) affordable and come in much larger capacities (up to 2TB at the time of writing, although I’m told “petabyte” sizes are just around the corner).
So, the question now is how do I put all of this knowledge together to deliver a faster overall solution?
From the networking perspective, I started off looking at 10 Gig-capable switches. I found a few options on eBay, including 24-port Juniper EX2500 switches for £600 each (now end of life, but they’d do the job); however, I ended up choosing a brand new Ubiquiti EdgeSwitch 16-XG for £450, which has a mix of 10GBase-T (RJ45) and SFP+ interfaces so that I could connect devices over either copper or fibre.
I connected the devices together with CAT7 cables bought on Amazon for £15, and 10Gtek Direct Attach cables to link the SFP+ devices (see below) to the switch. In my day job we have been using Mellanox DirectAttach cables; however, my UK suppliers seem to have had a falling out with Mellanox, as despite trying to buy these for work through both Hammer and Boston (both of whom have promised faithfully to always carry stock of essential items such as these), I have been unable to obtain any, despite repeated attempts to order over the previous six months. The 10Gtek ones work and come in at about the same price … and ordering is a lot less painful than having to raise purchase orders and deal with wholesalers on the phone. Plus, I wanted to try to do this using only items I could buy today as a “consumer”.
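Once the cables are in, it’s worth sanity-checking the raw link speed before blaming the storage. `iperf3` is the usual tool for this; it needs two hosts, so the invocations are shown as comments, followed by a quick conversion from the Gbits/sec that iperf3 reports into the MB/sec figures used for disk throughput:

```shell
# Sanity-checking a 10GbE link with iperf3 (needs two hosts, so the
# commands are shown as comments rather than run here):
#   on the NAS / server end:   iperf3 -s
#   on the workstation:        iperf3 -c <server-ip> -t 30
#
# iperf3 reports Gbits/sec; divide by 8 to compare with disk MB/sec.
# ~9.4 Gbit/s is roughly what a healthy 10GbE link achieves after
# protocol overhead (an assumed figure, not my measurement).
awk 'BEGIN { printf "%.0f\n", 9.4 * 1000 / 8 }'   # Gbit/s -> MB/s
```

In other words, a healthy 10GbE link should comfortably outrun both the ~180MB/sec RAID and a single SATA SSD, so if it doesn’t, suspect the cabling or NIC first.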
Next, I looked at off-the-shelf NAS solutions … the two lead contenders in this space appear to be Synology and QNAP. I placed orders for a number of different units; not all turned up, some are (still) on back order with the suppliers, and at least one supplier (Ingram Micro) cancelled my order and told me to re-apply for an account as they’d changed systems and I hadn’t ordered anything in their new system yet – despite my having just ordered something in their new system … go figure! 🙁
My original plan had been to compare Thunderbolt 3 networked devices to 10 Gig networked devices; however, as QNAP are (currently) the only manufacturer with a TB3-equipped unit, and as Ingram failed to supply the device (and nowhere else had stock), I have yet to complete that test.
A good way to verify that a file has not been corrupted during transfer to another computer is to use an MD5 hash. You calculate the digital signature of the file on both sides, then compare the output. If they are the same, you are OK; if not, you need to transfer the file again.
Mac OS X does not come with md5sum installed by default, but it ships with an equivalent tool: md5. To calculate the 128-bit MD5 hash of a file, run:
md5 [file.ext]
If you need the same output format that md5sum produces, use:
md5 -r [file.ext]
openssl also includes a function to calculate an MD5 hash:
openssl md5 [file.ext]
That is all. You can now be sure that the file you transferred via FTP, HTTP, or any other means is the same on both sides of the path.
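Putting the two ends together, the whole check can be scripted. The sketch below uses Linux’s md5sum (on the Mac, `md5 -r` gives the same output format) and a local `cp` standing in for the actual FTP/HTTP transfer:

```shell
# End-to-end verification of a transferred copy. md5sum is the Linux
# tool; on Mac OS X substitute "md5 -r" for identical output.
src=$(mktemp) && echo "some 4K footage" > "$src"
dst=$(mktemp) && cp "$src" "$dst"    # stands in for the ftp/http transfer

# Hash both sides; awk strips md5sum's trailing filename field.
a=$(md5sum < "$src" | awk '{print $1}')
b=$(md5sum < "$dst" | awk '{print $1}')

[ "$a" = "$b" ] && echo "OK: file intact" || echo "MISMATCH: re-transfer"
```

For multi-gigabyte ProRes files this takes a while (MD5 has to read every byte), but it is still far cheaper than discovering a corrupt render half-way through an edit.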