Monday, September 21, 2020

SSDs and the PS4

 Upgrading the drive, is it worth it?

TLDR: Yes, for both the regular PS4 and the PS4 Pro. But the reasons are not quite the same.

These drives have gotten cheaper and better. With onboard cache and smarter controllers, the lifespan, capacity and performance of SSDs have all improved. I just got this drive for $130 on Amazon.

Benefits

I'm sure a week or so later you will be able to find a cheaper option, but the point is mostly that this is an affordable one. And though storage makers insist on counting a megabyte as a round 1,000,000 bytes instead of the 1,048,576 (2^20) that RAM uses, hence shaving a bit off the actual space, that's still nearly a terabyte of storage.

Capacity

And that's where all this rant might start making sense. On the "original" or regular PS4, the base storage was 500 GB. Though that is nothing to sneeze at, it represents roughly 10 Blu-ray discs worth of data, so, reasonably, about 10 full size games. That's not a whole lot.
Doubling the capacity was certainly a nice added value. On the PS4 Pro, though, a 1TB drive is already included, so the capacity argument carries less weight there.

Speed

After a bit of research I found out that the theoretical speed limit for accessing the drive is different on the two platforms. The regular PS4 uses a SATA II interface, good for about 300 MB/s, while the Pro's SATA III interface goes up to 600 MB/s.

This particular drive is rated at about 550 MB/s, so the performance boost is, subjectively, not as significant on the original PS4. I say subjectively because a high performance regular HDD with a good amount of cache could probably shave a lot off the level loading times as well. After all, the included drive was a 5400 rpm HDD, not nearly the fastest available.

On the PS4 Pro (and for this I had to look online for homemade comparison videos) the gain is much more noticeable, since the extra interface bandwidth lets the SSD stretch its legs. (And it makes the gap between the two consoles a whole lot bigger, in fact.)

Conclusion

So on the original PS4 the gain is noticeable. From the inexpensive but reliable HGST HDD that came with it to the Crucial SSD, the difference is clear. Add to that the fact that it will probably double your capacity, and it's well worth the effort. The process itself was surprisingly easy, though re-downloading all of your media/games to the new drive is a bit of a pain.

For the PS4 Pro I guesstimate that the performance gain would probably be even more noticeable, but I cannot verify it for myself.

I think it was worth it. I'd love to hear about your own experiences with the Pro if you have one, or maybe even a second-gen console (Slim, etc.).

Leave your comments below!

Sunday, September 6, 2020

Serverless Containers on Azure

What the hoot?

Yes. There are half a dozen ways to host Docker containers in multiple cloud ecosystems. But sometimes, you need simple. And spinning up a container shouldn't always require a full Kubernetes cluster or a VM for that matter.

I had a requirement to create a hosting solution for an existing application, one that would be isolated from the internet but available to the client from their own infrastructure. Basically, something bound to an isolated VLAN. The application in question: SonarQube.

This was to be a medium sized solution for the client, and data retention was not critical. I don't know about you, but when the words "stateless application" come up, simple Docker containers pop into my mind.

So I went about investigating how I could get this spun up quickly in my customer's Azure subscription.

I hadn't worked a lot with the Azure CLI (command line interface), and was rather fond of using PowerShell to spin up my infrastructure as code. But as it turns out, the Azure CLI has gotten more love than the native PowerShell cmdlets for spinning up infrastructure (there are things you can do in the CLI that you can't in native PowerShell 😒).

PowerShell to the rescue!


I had to reevaluate why I use PowerShell. I use it to automate. In the end, I am agnostic about what actually creates the infrastructure. As long as it's flexible and can be saved as code, I shouldn't really care.

What I particularly like about PowerShell is its ability to take JSON strings and convert them into queryable objects. You get that whole "IntelliSense" experience when drilling down into an object's properties.


It makes objects returned as JSON strings very easy to explore, and their properties easy to discover; it also lets you enumerate collections of those items if need be.

And because the AZ CLI returns its data natively in JSON format, I can basically query it as if PowerShell had created the object itself. It's rather transparent.
The only caveat is that it doesn't give you a handle on the underlying Azure object to update it. But I can live with that. (Sometimes I can be practical and not so fussy, see? 😅)
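To make that concrete, here's a minimal sketch of the pattern (the resource group and VNET names are placeholders):

    # Pipe the AZ CLI's JSON output into ConvertFrom-Json and drill into the result.
    # (On older Windows PowerShell you may need an extra "| Out-String" before ConvertFrom-Json.)
    $rg = az group show --name "my-resource-group" | ConvertFrom-Json
    $rg.location                        # e.g. "canadacentral"
    $rg.properties.provisioningState    # e.g. "Succeeded"

    # Collections come back as arrays you can enumerate like any other PowerShell object.
    $vnets = az network vnet list --resource-group "my-resource-group" | ConvertFrom-Json
    $vnets | ForEach-Object { $_.name }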

So my first thought was: how do we go about scripting a simple web app and a DB using PS (PowerShell) and the AZ CLI?


Aaaaand that's when the DevOps guy in me got woke.

Recycling is good. 'Mkay?


There is no way this can be single use. But the requirements make the solution a bit rigid in practice. So what if we added an optional Application Gateway? That way we can either firewall a specific origin network, or have an open-access, authentication-gated solution for those who require it!

So what do we need all in all? And in what order?

  1. A Vnet to host the apps
  2. A database (I'll use SQL Server in case the client wants to migrate the thing to their managed instance later; a sketch follows the list)
  3. A web app container
  4. Optionally a public IP
  5. An optional Application Gateway
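
For the database piece (item 2), a rough sketch using plain AZ CLI; the server name, credentials and SKU are placeholders, and $resourceGroup and $location are defined at the top of the script:

    # Sketch: a logical SQL server plus a small database for SonarQube.
    az sql server create --resource-group $resourceGroup --location $location `
        --name "sonar-sql-srv" --admin-user $dbUser --admin-password $dbPassword

    az sql db create --resource-group $resourceGroup --server "sonar-sql-srv" `
        --name "sonar" --service-objective S0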

Spoiler Alert!👈

So we need some sort of control flow to check that the requirements are created, and in the right order. If we assume we have built some functions, isVnet, isDatabase, isSonarQube
and so on, that each return something truthy if the required piece of infrastructure exists or was successfully created, then we can do this:
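In sketch form (the exact messages and structure will vary, but the idea is to fail fast as soon as a prerequisite can't be satisfied):

    # Each is* helper returns something truthy when its piece exists or was just created.
    if (isVnet -VnetName $vnetName) {
        if (isDatabase) {
            if (isSonarQube) {
                Write-Output "SonarQube environment is up and running."
            }
            else { Write-Error "Could not create the SonarQube container." }
        }
        else { Write-Error "Could not create the database." }
    }
    else { Write-Error "Could not find or create the VNET." }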


So what would the functions look like?
Here's the VNET inspection and creation function.
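Or at least a sketch of it, where $resourceGroup, $location and $vnetAddressPrefix are assumed to be variables defined at the top of the script:

    # Reuse the VNET if it already exists, otherwise try to create it.
    function isVnet {
        param([string]$VnetName)

        # 'az network vnet show' errors out when the VNET doesn't exist, so silence stderr.
        $vnet = az network vnet show --resource-group $resourceGroup --name $VnetName 2>$null |
                ConvertFrom-Json
        if ($vnet) { return $vnet }   # already there, pass it along

        # Not there: try to create it. On failure the CLI returns nothing,
        # so the caller's if () test simply comes out false.
        az network vnet create --resource-group $resourceGroup --name $VnetName `
            --location $location --address-prefixes $vnetAddressPrefix |
            ConvertFrom-Json
    }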


Now, I considered using a name for the VNET, in case multiple ones need to live in the same subscription/tenant. But in reality, if we think about it, it shouldn't be necessary: a single VNET can keep all the required infrastructure isolated from the world. Unless... you have to have multiple instances of the app, isolated from each other... so okay, let's name them so we can have separate ones, just in case. BUT... what if...

If we pass the name of an existing VNET in our parameters, we can REUSE that one and insert our infra into it.
So our function checks whether the VNET exists first and, if it does, passes it along. If not, it tries to create it and returns whatever it just created, or nothing at all if there was an error. Which is okay, 'cause we trap that in our earlier control logic.
All the other variables are defined at the beginning of the script.


Oh, and by the way, my containers for running SonarQube as a serverless container are publicly available.

Side Note

SonarQube runs on Java, and the built-in search engine it uses is an open source version of Elasticsearch. Elasticsearch has a peculiar requirement to ensure it has enough memory and resources to run properly. This can be very useful in environments where we keep tight control over resources, but in most cases we will have at our disposal much more than the 2 gigs (if memory serves) that Elasticsearch requires.

However, the engine will query the operating system at SonarQube startup time to see whether it has that resource reserved. Not available, reserved. And that means that unless you set this manually on your OS (which, by the way, is Linux based), the app won't run.

Not having a handle on the underlying OS that hosts the containers in Azure (it's serverless, see?) meant the containers could never manage to spin up.

Fortunately, the original SonarQube Docker container source is on GitHub, so I could hack the startup file to skip the check and publish my own container without any licensing issues. This is all open source, and the original author is properly credited. 😀 So says I.

So yeah, it took some doing.😓

Drum-roll please

So how do you create the serverless container(s)?
It turns out it's pretty simple: pretty much the same way you create any other piece of infra with the AZ CLI.
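A sketch of the call (image name, resource names and credentials are placeholders, and the SONAR_JDBC_* variables are the ones documented for the official SonarQube image; a custom build may expect different ones):

    # Run the container inside the VNET, pointed at the Azure SQL database created earlier.
    az container create `
        --resource-group $resourceGroup `
        --name "sonarqube" `
        --image "myregistry.azurecr.io/sonarqube:custom" `
        --cpu 2 --memory 4 `
        --ports 9000 `
        --vnet $vnetName --subnet "containers" `
        --environment-variables SONAR_JDBC_URL="jdbc:sqlserver://sonar-sql-srv.database.windows.net:1433;database=sonar" `
        --secure-environment-variables SONAR_JDBC_USERNAME=$dbUser SONAR_JDBC_PASSWORD=$dbPassword

With --vnet and --subnet the container only gets a private IP inside the VNET, which is exactly what the isolated setup calls for.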


The container creation syntax is very reminiscent of the Docker CLI's when it comes to the container-specific parameters (environment variables, notably).
And yes, this function could be optimized and broken up into smaller parts, for controlling the hosting SKU size and the DB check.

But it is just as quick to edit the script to change the values as it is to add parameters and consume them, so I didn't really see the point. (See? Practical... again. Yes, me!)

 Conclusion

Not so bad, eh? Because the AZ CLI returns nice JSON that PS handles very easily, it is simple to query and set up. Automation becomes very feasible as long as you have installed the Azure CLI (which runs on Linux and Windows) and PowerShell, ditto.

And considering that our friends at Microsoft will give you a free $200 credit to play with Azure on your first subscription ($260 CAD, yay, for once we get parity! Thank you Microsoft 💓), there is no reason not to start hacking on this stuff right away.

So. What are you waiting for?⏰
