Perpetual learning is paramount in any profession, but I’ve found that for those who work in cyber security it is absolutely critical. A significant part of my work involves knowing what risks lurk, both in the wild and internally, that can stand in the way of an organization’s future success. Keeping up with these risks, mitigation techniques, and controls is vital.
There are all types of learning that help new concepts find a home in my brain. One comprehensive learning experience that I recommend for anyone in cyber security is an event put out each year by SANS, which is an organization that trains cyber security professionals. The event is called the SANS Holiday Hack Challenge.
This year my 9-year-old son helped me in ways that blew my mind. He went after small details I had dismissed as insignificant that turned out to be a pretty big deal. He was very excited by what he was able to uncover…and so was I.
The SANS Holiday Hack Challenge introduces cyber security professionals and pen-testers to new technologies and opens their minds to risks and mitigation techniques they had not previously considered. I greatly enjoy the ‘terminal challenges,’ which provide hints toward solving objectives. Never before had I decrypted HTTP/2 traffic using Wireshark and SSL keys. So awesome! Here’s the link for this year’s challenge, which has been a wild ride for me, to say the least: https://www.holidayhackchallenge.com/2018/.
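If you want to try HTTP/2 decryption yourself, the general technique is to have the browser log its TLS session keys and then point Wireshark (or its CLI cousin, tshark) at that key log. A rough sketch, assuming a Linux box with Chrome and tshark installed; the interface name and file paths are just examples:

```shell
# Ask the browser to log TLS session keys (Chrome and Firefox honor this
# environment variable when launched from a shell that exports it).
export SSLKEYLOGFILE="$HOME/sslkeys.log"
google-chrome &   # browse the site you want to capture

# Capture the traffic, then read it back with the logged keys so the
# HTTP/2 streams can be decrypted and filtered.
tshark -i eth0 -w capture.pcap
tshark -r capture.pcap -o tls.keylog_file:"$HOME/sslkeys.log" -Y http2
```

Note that older Wireshark releases call the preference `ssl.keylog_file` rather than `tls.keylog_file`; in the GUI it lives under the TLS (or SSL) protocol preferences.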
Stop in and poke around. Solve a terminal challenge or two then put it on your holiday to-do list for next year. You won’t regret it!
For Christmas we got our son an Arduino Uno starter kit. It’s not officially an Arduino, though. The hardware specifications are the same, but it is made by a company called Elegoo. What we purchased was the “Complete Starter Kit”, and I highly recommend it. So far we’ve made prototypes for the following: 1) blinking LED lights, 2) a joystick controlling a servo motor, and 3) an ultrasonic sensor that tells us how far away objects are. There have been a few other things, but those are what come to mind as I write.
Besides being extremely fun and interesting, these prototypes foster a new understanding of all the electronic things we use and how they may be wired. We could have gotten a kit for a robot or a remote-controlled car, but testing out a range of sensors seems to broaden our view of what’s possible. If we decide on a full project, we’ll have a much better idea of what we’ll need and whether it will work.
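The ultrasonic sensor is a nice example of how simple the underlying idea can be. Assuming it’s an HC-SR04-style sensor (the usual one in these kits, though I’m guessing), it emits a ping and reports how long the echo took to come back; the sketch on the board just does the arithmetic. Here’s that arithmetic in Python:

```python
# Distance math behind an HC-SR04-style ultrasonic sensor (the exact
# sensor model is an assumption). Sound travels out to the object AND
# back, so the one-way distance is half the round trip.

SPEED_OF_SOUND_CM_PER_US = 0.0343  # ~343 m/s at room temperature

def echo_to_distance_cm(echo_time_us: float) -> float:
    """Convert an echo round-trip time in microseconds to distance in cm."""
    return (echo_time_us * SPEED_OF_SOUND_CM_PER_US) / 2

# A 582 microsecond round trip works out to roughly 10 cm.
print(round(echo_to_distance_cm(582), 1))
```

On the actual board you’d measure the echo time with `pulseIn()` in the Arduino sketch, but the conversion is the same.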
Also, as a side note, since I’m using my Chromebook for these projects I’m not using a locally installed IDE. Instead, I’m paying $1 a month to use the cloud service provided by Arduino for building sketches. So far it has worked flawlessly. ChromeOS does have a Linux sandbox now, though, so I’m going to see if I can install an IDE that way, too.
If you’re like many IT professionals who’ve had anything to do with large amounts of data, you’ve become immune to the phrase ‘big data’. Mostly because the meaning behind that phrase can vary so wildly.
Processing ‘big data’ can seem out of reach for many organizations, either because of the infrastructure costs required to establish a foothold on this front or because of a lack of organizational expertise. And since the meaning of ‘big data’ can vary so much, you may find that you’re doing ‘big data’ work and then ask yourself, “Is this big data?” Or an observer can suggest that something is ‘big data’ when you know full well that it isn’t.
With my own background in data, I’m ever curious about what’s out there that can make the threshold into ‘big data’ seem less insurmountable. Also, I’m interested in the security considerations around these solutions.
In the last week or so, I’ve gotten more familiar with AWS S3 buckets and a querying service called Amazon Athena. Here’s the truly amazing thing: you can simply drop files in an S3 bucket and query them straight from Amazon Athena. (There are just a couple of steps to go through, but they are mostly trivial.) And for the most part, there’s not much of a limit on how much data you can query and analyze. You can scan 1 TB of data for $5. What? That’s right. And you didn’t have to set up servers, database platforms, or any of that. I’ll be exploring Amazon Athena more and more over the coming weeks. If you have an interest in this sort of thing, I suggest you do the same.
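Since Athena bills by data scanned rather than by servers, estimating a query’s cost is just arithmetic. A quick back-of-the-envelope sketch, using the $5/TB rate mentioned above (check current AWS pricing, since rates and rounding rules can change):

```python
# Back-of-the-envelope Athena cost estimate. Athena bills per TB of
# data scanned; columnar, compressed formats like Parquet can cut the
# bytes scanned (and therefore the bill) dramatically.

PRICE_PER_TB = 5.00  # USD per TB scanned, the rate quoted in the post

def athena_query_cost(bytes_scanned: int, price_per_tb: float = PRICE_PER_TB) -> float:
    """Estimated cost in USD for a query that scanned `bytes_scanned` bytes."""
    tb_scanned = bytes_scanned / 1024**4
    return tb_scanned * price_per_tb

# Scanning a full terabyte costs about $5...
print(round(athena_query_cost(1024**4), 2))
# ...while scanning 10 GB costs about five cents.
print(round(athena_query_cost(10 * 1024**3), 3))
```

That per-scan pricing model is exactly why it feels like ‘big data’ without the ‘big infrastructure’: the cost of an experiment is a few cents, not a cluster.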
One note: Google has something similar called BigQuery, so that might be worth a look as well. I’ve explored BigQuery briefly, but I keep coming back to various AWS services since AWS seems to be holding strong as a dominant leader in emerging cloud technologies. But as we all know, the emerging technology landscape can change very quickly!
For some time, I’ve been interested in learning about the Raspberry Pi. It’s a little bare-bones computer that packs a big punch. And to top it off, it’s quite affordable. Through work I heard about a way to use a Raspberry Pi for an OS called RetroPie. RetroPie is an emulation platform that lets you play scores of old games…if you have the digital files for them, many of which can be found with the help of Google.
I’m not much into modern video games (as in games from the last 20 years or so), but I did play NES games back when I was in jr. high and high school. And I do still have my original NES, but it has a number of issues that make it less than reliable for playing. My kids are interested in the older games because I’ll actually join them when they play. And, quite frankly, because the older games are super fun to play and easy to learn.
Anyway, RetroPie is a great way to learn how to use and get familiar with the Raspberry Pi. You simply burn the RetroPie image onto a micro SD card, pop it in the micro SD card slot, and boot it up! There are a few other things you need to know, but that’s the gist of it. Get a few games, a controller or two, have a monitor with an HDMI input handy, and you’re good to go. That’s a bit of an over-simplification, but please do explore RetroPie and the Raspberry Pi if you’re at all interested in this sort of thing and are looking for a good way to get familiar with the Raspberry Pi world.
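For the curious, the “burn the image” step on a Linux or Mac box can be as simple as this sketch (the image filename and the `/dev/sdX` device are placeholders; run `lsblk` first and be absolutely sure of the device, because `dd` will happily overwrite the wrong disk):

```shell
# Unpack the downloaded RetroPie image (the exact filename varies by
# release and Pi model), then write it raw to the micro SD card.
gunzip retropie-4.x-rpi.img.gz
sudo dd if=retropie-4.x-rpi.img of=/dev/sdX bs=4M status=progress
sync   # flush all writes before pulling the card out
```

There are also friendlier GUI tools (Etcher and the like) that do the same thing with fewer sharp edges.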
These days efforts to revamp company culture are in vogue. I’m going to attempt to articulate what I see as a connection between machine learning and efforts to change company culture. Stay with me here a bit because the analogy doesn’t show up until the fourth paragraph and I need to share a little bit of background first. 🙂
One group leading the charge to change company culture is Partners in Leadership (https://www.partnersinleadership.com). They use a tool that identifies the following flow toward changing results. It’s a pyramid that moves from experiences to results in these steps: EXPERIENCES >> BELIEFS >> ACTIONS >> RESULTS. According to the model, you start with the results you want to see as an organization and then move backward until you’ve arrived at the experiences you need to create. The thinking is that experiences shape beliefs, which shape actions, which shape results. They maintain that you cannot simply skip ahead to results until the rest of the house is in order first.
As for the experiences, they actually need to be high quality experiences. Partners in Leadership breaks these experiences into four types (big paraphrase here): 1) Easy to interpret, 2) Needing work to interpret, 3) Very little meaning, so there isn’t much to interpret, and 4) Experiences that, well, kind of did the opposite of what they were intended to do.
Now it is time for the machine learning analogy! Boiled down, machine learning is essentially learning from experiences (data) in order to shape beliefs (trained statistical models). These beliefs/models turn into actions (acting on the outcome of a model), which lead to results. Critical to this process is the experiential data and its interpretation (the model). We train our models by feeding data (experiences) into them. Why am I making this connection? Because organizations are really struggling to understand machine learning. Why not piggyback off something they’re already learning? Results from machine learning algorithms are no different from results gleaned from an organization’s cultural change initiatives. What data do you have that you can use to shape your statistical models? Which actions do you need to take to get results? You can change your culture and understand machine learning at the same time!
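To make the mapping concrete, here is a deliberately tiny, made-up example of the pyramid in code: the data is invented for illustration, and the “model” is about as simple as a model can get (a learned threshold), but each of the four layers shows up by name:

```python
# A toy version of EXPERIENCES >> BELIEFS >> ACTIONS >> RESULTS,
# mapped onto machine learning. All numbers are invented.

# EXPERIENCES: (value, label) pairs -- the data we learn from.
experiences = [(1, 0), (2, 0), (3, 0), (7, 1), (8, 1), (9, 1)]

# BELIEFS: a "trained model" -- here, just a threshold halfway between
# the average value of each class.
lo = [v for v, label in experiences if label == 0]
hi = [v for v, label in experiences if label == 1]
threshold = (sum(lo) / len(lo) + sum(hi) / len(hi)) / 2

# ACTIONS: act on the belief by classifying new observations.
def act(value: float) -> int:
    return 1 if value >= threshold else 0

# RESULTS: how well those actions turn out on the data we have.
accuracy = sum(act(v) == label for v, label in experiences) / len(experiences)
print(threshold, accuracy)
```

Swap in messier data and a real training algorithm and the structure is the same: better experiences (data) shape better beliefs (models), which drive better actions and results.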
How much of the world’s IT infrastructure is in the cloud now, and how much of it will be in the cloud in five years? I’m sure there is solid data somewhere to answer those questions. Regardless, the shift is happening, and it won’t be long until most IT infrastructure is in the cloud.
Oddly, though, in my conversations with other IT professionals, it seems like we’re finding we’ve arrived late to the party. With the advent of “the cloud,” organizations are discovering all sorts of solutions that don’t necessarily need the involvement of traditional IT. In much of the IT world, our perception is that this shift is gradual when in fact it is accelerating.
So the real question is not whether “the cloud” is coming, but whether we see it coming. If we want to make sure cloud implementation is done properly and doesn’t completely hose our respective organizations, we must learn as much as we can in a very short period of time.
Nearly every day I find myself reading about cloud security risks right alongside incredible cloud solutions to problems that would normally be much harder to solve. At the same time, many cloud solutions create problems that we’ve never seen before. With the flip of a switch, something private can become public: see S3 buckets. And it isn’t so much that the cloud is insecure, but how we connect to the cloud, whether through our API infrastructure or ports that maybe shouldn’t be…open. The only answer I have for all of this is that we need to learn, learn, learn, learn…and fast.
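On the S3 front specifically, one concrete mitigation for the “flip of a switch” problem is to block public access at the bucket level so a stray ACL or policy can’t expose anything. A sketch using the AWS CLI (the bucket name is a placeholder):

```shell
# Block every flavor of public access on a bucket: public ACLs,
# public bucket policies, and cross-account exposure via either.
aws s3api put-public-access-block \
  --bucket my-example-bucket \
  --public-access-block-configuration \
      BlockPublicAcls=true,IgnorePublicAcls=true,BlockPublicPolicy=true,RestrictPublicBuckets=true
```

It’s one setting, but it turns the accidental-public-bucket class of incident from one misclick into several deliberate steps.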
I’ve been a Linux user at home for quite some time. We were a Windows family very early on but ran into issues with viruses. I resurrected a super old laptop, put Lubuntu on it, and gave it to my wife. It worked well for years. After a while, one thing or another wouldn’t work, so on a whim I got her a Chromebook. Nearly everything she does is online, and she’d already started using Google Docs on the Lubuntu PC. As a result, the transition was peachy! After watching her tote that thing around the house for a year or so, and noticing how little she worried about charging the battery or booting it up, I decided I needed one too!
It’s done quite well for me. Occasionally, I have to jump over to my Ubuntu desktop for more high-powered activity, but 80% of my computing at home is on the Chromebook. This experience, and the evolution of computing as it moves into the cloud, is leading me to believe that the days of everyone running around with what is essentially their own personal server are numbered. I’m guessing that in five to eight years, computing will be even more cloud-focused than it is now and people won’t really own traditional laptops any more.
I’ve got just about everything marked off my list on the AWS learning front for the weekend. This domain is now transferred over from my old host: jeshuaerickson.com. I started up a WordPress instance using AWS Lightsail. Then I assigned a static IP to that instance. I also got my DNS zone set up. Finally, I got my SSL cert completed and integrated with Apache. (There are pretty straightforward Bitnami guides for this. It’s not done through the regular “Services” interface. Just remember when you’re in Lightsail, you’re in the Bitnami world now!)
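For anyone retracing these steps: on current Bitnami-based Lightsail images, the SSL piece is handled by a bundled helper that walks you through a Let’s Encrypt certificate and wires it into Apache, including the renewal cron job and HTTP-to-HTTPS redirects. Assuming a recent image (older ones used a more manual Let’s Encrypt process), it’s a single interactive command over SSH:

```shell
# Bitnami's HTTPS configuration tool: prompts for your domain(s),
# requests the certificate, and updates the Apache config for you.
sudo /opt/bitnami/bncert-tool
```

As the post says, it’s not done through the regular AWS “Services” interface; once you’re in Lightsail, the Bitnami docs are the ones to follow.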
The other piece that I worked out was taking a snapshot and doing a restore, which is basically getting rid of the old instance and assigning the new instance to the static IP I created. I was expecting to see a “restore” button in Lightsail, but that’s not how it works. It makes sense now that I’ve gone through the process once. (I had to do this because I hosed the SSL cert integration the first time around.)
Throughout all of this I’ve been attempting to keep track of AWS billing. I had a Directory Service charge pop up and didn’t find it until I started poking around in another region. I did some quick back-and-forth with AWS and got a credit for those charges. Ultimately, I was very happy with how responsive they were. The variety of services offered inside AWS is INSANE, and billing can get a little tricky to navigate if you’re not familiar with the AWS administrative console.