Video Transcript: Hardware
Hello and welcome to week two of Managing Information Systems. I'd like to start us off with just a little prayer. Dear Heavenly Father, I'd like to thank you so much for this opportunity for us to learn together. I pray that You would bless every student who might be watching as they try to learn this material, that You would help them use this information to benefit You and be a light for others in their workplaces, their families, and their churches, and that You would watch over them as we all learn together and follow Your path, knowing that every time we learn, we're going exactly where we're supposed to be, learning something new to benefit Your kingdom. In Your heavenly name we pray, amen.

As we jump into week two, we're going to be talking about the hardware involved in information systems. As a reminder, there are five components that make up an information system: the hardware, the software, and the data, which together form the technology side, and then the people and the processes, which are the human side. Today we're talking specifically about the first one, the hardware. If you're thinking about what makes up hardware, there could be so many things that come to mind. It's really anything you can physically touch that is part of the technology: your desktop or laptop, the components that go into them, your thumb drives and flash drives, your mouse, your keyboard, the microphone you talk into, the camera you use for video input. Anything that moves information in or out of the technology, and the physical technology itself, is the hardware. It sounds simple enough, but we're going to jump into all the components that go together, and into some things you might not even think of that have now become technological pieces of hardware.

If we look at our daily interactions, the things on the list in front of you are pretty straightforward, things you would probably recognize pretty easily as technology hardware. Where it gets interesting is when you think about your car, maybe your refrigerator, for some of you your toothbrush. I have a picture frame in my house that has now become a piece of technology hardware. These used to be pretty standard objects; your toothbrush didn't have a technological component 50 years ago, back when we were talking about that progression of information systems and where we're going. Now, when you get into a car, think of all the pieces of technology coming together. At what point does that car become a piece of technology hardware? We're going to get into exactly what the litmus test is for what counts as a piece of hardware. Or your refrigerator: I have a friend, and when I go to her house, she pushes a button on the refrigerator and a camera inside tells her exactly what she's running low on and sends a shopping list directly to her smartphone. At what point does your refrigerator become a piece of technology hardware? We're going to talk about some of these examples.
So what makes up a digital device? What really differentiates a digital device comes down to bits and bytes. So what are bits? In a digital device, everything is represented by an electric signal that is either a one or a zero. You've probably heard of binary code; we're going to talk about it at a really broad level in this course. We're not getting into the binary or programming languages you would need if you were creating a program or a piece of software, but we are going to talk about what makes a digital device digital, and that's a switch that is either on, which is a one, or off, which is a zero. The communication language inside a digital device is binary code, and that's zeros and ones. The word bit refers to those ones and zeros; it's just a blend of the words binary and digit. And every time you have eight bits, you get a byte.

You might have heard processors described in bits. We had eight-bit processors, which could process eight bits of information, eight zeros and ones, at a time, and that was a huge leap forward in the technology. Now 64-bit processors are pretty typical on modern computers, which means the processor works with 64 binary digits at a time and can represent two to the 64th power different values in a single chunk of data. It was a huge leap forward to get to eight bits, and another huge leap to get to 64-bit processors, with stops along the way at 16 and 32 bits. I don't want you to get intimidated by processing speeds or by this idea of binary code; it's just a way for you to understand how much data we're really talking about, and how much a digital device can really handle, process, and store for us.

As we're talking about that, let's jump to the binary prefixes. These might seem familiar from a math or science course. Recently I bought a new computer, and its storage capability was described in terms of terabytes, which sounds like a lot of data, but how big is a terabyte really? These prefixes tell us: kilo means a thousand, so a kilobyte is 1,000 bytes, and remember, every byte is eight bits. A megabyte is a million bytes. A gigabyte, of course, is a billion. And a terabyte is a trillion bytes, which is eight trillion bits. If you can comprehend the sheer volume involved in moving from a single bit to a terabyte, you can see how much data is now possible. Getting a handle on these prefixes will help you understand the progression of size from kilo to mega to giga to tera. Keep them in mind when you see a file that is a few kilobytes next to a process that's running three gigabytes; that's the difference between a few thousand bytes and a few billion bytes.
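To make those prefixes concrete, here is a minimal Python sketch (not part of the lecture slides) that simply prints each prefix as bytes and bits, using the base-10 definitions above: kilo is one thousand, tera is one trillion, and one byte is eight bits.

```python
# A quick sketch to make the prefix jumps concrete, using the base-10
# definitions from the lecture: 1 byte = 8 bits, kilo = 10^3, tera = 10^12.

PREFIXES = {
    "kilobyte": 10**3,
    "megabyte": 10**6,
    "gigabyte": 10**9,
    "terabyte": 10**12,
}

for name, num_bytes in PREFIXES.items():
    num_bits = num_bytes * 8          # every byte is eight bits
    print(f"1 {name} = {num_bytes:,} bytes = {num_bits:,} bits")

# Prints, for example:
# 1 kilobyte = 1,000 bytes = 8,000 bits
# 1 terabyte = 1,000,000,000,000 bytes = 8,000,000,000,000 bits
```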
These are very large steps up in terms of data, and as our information and our capabilities progress, you're going to see these jumps happen even more dramatically, especially when we talk about processing speed.

So here's a closer look at binary. Again, I am certainly not asking you to be able to take a number and turn it into binary code, but I want to give you a little background on how it works so you can understand what's happening here. Just like in math, take the number 216. If you go back to your basic math days, the six is in the ones column, which is ten to the zero power, so it just stands for six. The one is in the tens column, ten to the first power, so it really represents the number 10. The two is in the hundreds column, ten to the second power, so it really represents 200. So 216 really means 200 plus 10 plus 6, which of course is 216 when we read it as a decimal number.

Now look at the chart below; let me get rid of my picture here so you can see the whole chart. As a binary number gets bigger and we add digits, remember that binary code is just ones and zeros, so each column holds either a one or a zero: it's either on or it's off. The first place value is two to the zero power, which is one. The second is two to the first power, which is two. The third is two to the second power, which is four. So the pattern of place values is one, two, four, eight, 16, 32, 64, 128, 256, and so on. That's why, when you see things measured in technology, whether storage space, memory, or processing, you see these numbers used so often. It's the reason your computer processor is 64-bit and not 65-bit: 64 is one of those powers of two, the seventh place value in binary.

If we break the number 216 down into its binary version, we get 11011000, which of course doesn't mean anything if you're not a coder, and I certainly would not expect you to come up with that yourself. But I wanted you to see how it is built. Look at the leftmost column, the eighth digit, which has a value of 128 because it is two to the seventh power. There's a one there, a one in the 64 column, a zero in the 32 column, a one in the 16 column, and a one in the 8 column; all the other columns are zeros, they are turned off. So it's 128 plus 64 plus 16 plus 8, which adds up to 216. Please don't be intimidated by this; there is definitely no expectation that you can take big values and turn them into their binary versions. I just wanted you to become familiar with how the place values and the order of the digits determine the binary value. Unlike the decimal system, in binary each digit is simply on or off. The more you can wrap your mind around that concept as you go through each place value, the better you'll understand that computers don't talk in an alphabetical language: it's zeros and ones, it's on or it's off.
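If you'd like to see that same walkthrough as code, here is a short Python sketch that breaks 216 into its binary place values exactly the way we just did by hand. Python's built-in bin() function is used only as a cross-check.

```python
# A small sketch mirroring the 216 walkthrough above: work down the place
# values from 128 to 1 and record which "switches" are on.

number = 216
place_values = [2**power for power in range(7, -1, -1)]   # 128, 64, 32, 16, 8, 4, 2, 1

remaining = number
bits = []
for value in place_values:
    if remaining >= value:        # this column's switch is on
        bits.append("1")
        remaining -= value
    else:                         # this column's switch is off
        bits.append("0")

print("".join(bits))              # 11011000
print(bin(216))                   # 0b11011000, Python's own check of the same digits
print(128 + 64 + 16 + 8)          # 216, the "on" columns added back together
```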
Now let's move forward and take a tour of a personal computer. We're going to talk about PCs because you're probably really familiar with them. I'm working right here on my laptop, and in my other workplace I have a desktop computer. Most of you have probably interfaced with one of these before, so we're going to use the personal computer as our model for talking about digital devices. There are all kinds of digital devices, but if we look through the lens of a personal computer, here's what they have in common: a CPU, the central processing unit; a memory component; a circuit board, and you've probably seen or heard of the motherboard in your computer and how those connections are made; storage; and input and output devices. These are common to most personal computers you're going to see, and we'll talk about each of them in a little more depth.

First, the CPU, your central processing unit. The processing speed of your CPU really determines the performance of the device you're using. It carries out the commands that are sent to it by the software. We already talked about how hardware is kind of useless if it doesn't have software telling it what to do, and the CPU is the brains of the operation: the software tells it what to do, it processes the data that comes in, and it returns results to be acted upon. How fast the CPU runs is known in the IT world as its clock speed, and it's measured in hertz. If you think back to the prefixes from just a few minutes ago, kilohertz, megahertz, and gigahertz are the quantitative measures of processing speed.

Processors can have one core or more than one core on a chip; with two cores, it's like you're running two full processors side by side. If you're shopping for a new computer or a laptop and you hear that it has a dual-core or quad-core processor, that means there are two or four processing cores on that chip, which greatly increases the processing power of your computer by providing the capabilities of multiple CPUs in one device. And as time marches on, these devices with incredible processors and multiple cores on a single chip keep getting smaller and smaller. If you're looking for optimal speed, if you're in an organization that needs a lot of graphics in real time, or you're uploading and downloading a lot of video and audio and need those processing commands carried out very quickly, your CPU's clock speed is something you're definitely going to want to pay attention to.
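As a rough illustration of what a clock speed figure means, here is a small Python sketch. The 3.2 GHz value is just an assumed example, not a measurement of your machine; the only thing actually queried from the computer is the logical core count the operating system reports.

```python
# A minimal sketch relating clock speed to time per cycle, and asking the OS
# how many logical cores it sees. The 3.2 GHz figure is an assumed example.
import os

clock_speed_hz = 3.2e9                      # 3.2 gigahertz = 3.2 billion cycles per second
seconds_per_cycle = 1 / clock_speed_hz
print(f"One clock cycle at 3.2 GHz takes about {seconds_per_cycle:.2e} seconds")

print(f"Logical cores reported by the OS: {os.cpu_count()}")
```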
The speed of processing is always getting faster. There's a principle called Moore's law, named for Gordon Moore, one of the founders of Intel, which says that the number of transistors on a processing chip doubles roughly every two years, so that march of progress in technology and processing speed keeps doubling on that schedule. If we look at the chart here, you can see the general trend. It's a little tricky to make out exactly what all these advances are, but each data point represents a leap forward to a new processing capability, starting back in the 1970s and ending in 2020, and it really does follow the line Moore's law predicts. Around 2020 there were lots of somewhat controversial articles, and you might be very familiar with them, asking whether Moore's law is dead: there's no way this can continue, the chips are so small already, you can't shrink things much more than this, so how can this path continue? Many said that by 2020 or 2022 Moore's law would be dead. I would argue that's certainly not the case, that Moore's law is still very much in effect, but we are starting to change the way we measure processing power. Now that we're really getting into machine learning and artificial intelligence, and into the ways computers and people are able to communicate through AI, I believe the trajectory will not change, and the data supports that, but the way we measure some of these processing speeds will be done differently as the technology evolves. So it's a really exciting time to be involved in the actual hardware. Maybe your goal is to be a really adept end user and know the hardware inside and out, or maybe your goal is to be a developer who contributes a data point to Moore's law chart, whatever that advancement in processing speed looks like in the future: being able to do more powerful calculations and more powerful computing with less space, which is truly impressive.
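Here is a back-of-the-envelope Python sketch of that doubling. The 1971 starting point of roughly 2,300 transistors (the commonly cited figure for Intel's first microprocessor) is used only for illustration, and the estimates are idealized projections, not actual chip counts.

```python
# A back-of-the-envelope sketch of Moore's law: transistor counts doubling
# every two years, starting from the roughly 2,300 transistors commonly
# cited for 1971. Idealized illustration only, not real chip data.

start_year = 1971
start_transistors = 2_300

for year in (1971, 1981, 1991, 2001, 2011, 2020):
    doublings = (year - start_year) / 2          # one doubling every two years
    estimate = start_transistors * 2 ** doublings
    print(f"{year}: roughly {estimate:,.0f} transistors per chip")
```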
So we have the processing speed covered. Next, let's talk about memory. There is hard disk memory, which used to mean spinning disks inside the big metal box of your computer: the hard disk, or hard drive. Now there are many opportunities to use an SSD instead, and you might see this term if you're shopping for a new device. SSD stands for solid-state drive. It's pretty dominant in the market now, it uses flash memory instead of spinning disks, and it has the same function as a hard drive or hard disk, which is long-term storage. I'd like you to think of this as the whole pantry in the kitchen: it's where all your ingredients get stored long term, so if you need them, they're there.

Next is the RAM, the random access memory. This is working memory. Back in the kitchen, you have your whole pantry of ingredients, and for what we're making for dinner tonight we pull a few things out and put them on the prep table: we start chopping some vegetables, washing some fruit, prepping our recipe. That prep table is the RAM, the working memory of the computer. When we're done, everything goes back into the hard drive or the SSD, back into the pantry; it's still there in long-term memory, we're just not using it right now as part of the working RAM. Because the RAM only holds what you're using right now, the computer can access it faster; it doesn't have to look through every file, it knows exactly what you're working with.

There's also removable memory. It used to be a floppy disk or a CD in your computer, and now it's a thumb drive or flash drive, SD cards, or an external hard drive that you plug right into your computer or laptop: any piece of removable hardware that can hold data separately. So you've got your long-term storage, the hard drive or SSD, the pantry; your RAM, the prep table holding what you're using right now; and your removable memory.
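Sticking with the pantry analogy, here is a small Python sketch that asks the operating system how big the long-term storage holding the current folder is, reported in gigabytes using the base-10 prefixes from earlier. It uses only the standard-library shutil.disk_usage function.

```python
# A quick sketch of the "pantry": report the size of the long-term storage
# (hard drive or SSD) that holds the current directory, in base-10 gigabytes.
import shutil

usage = shutil.disk_usage(".")
print(f"Total long-term storage: {usage.total / 10**9:,.1f} GB")
print(f"Already used:            {usage.used / 10**9:,.1f} GB")
print(f"Still free:              {usage.free / 10**9:,.1f} GB")
```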
Then we have our inputs and our outputs. These are anything that helps you communicate with the machine and helps the machine communicate with you: your mouse, so you can give direction; the trackpad on your computer, if you have one; your keyboard; your microphone, which feeds audio information into the computer; and your camera, which, if you're on a video chat, takes your image and your sound and puts them into the machine. Then, of course, information comes back out to you through the monitor or a TV screen. These connections can be wired, say a keyboard plugged into the computer or a mouse with a long cord, or they can be wireless. Bluetooth has really changed the accessibility here, letting information move between the machine and the person and back without a hardwired connection. Inputs and outputs are a big part of information systems because they help us talk to the machines and let the machines talk back to us. I hope that makes sense.

All right, some more hardware: mobile computing. We talked about the big desktop computers, the things that stay put. With mobile computing, your laptops, smartphones, tablets, and e-readers are things you can take with you, so you're computing on the go without all the wires. This is a fairly modern convenience, and now it's abundantly available: you can hail a taxi from your phone while walking down the sidewalk, or have a doctor's appointment through telehealth while you're on the bus. All the things that make mobile computing possible have a hardware component. Of course there are the other components too, the software and the networking, but we're talking specifically about the hardware that makes it possible: the smartphones, laptops, and tablets that you press right on to communicate back and forth with the machine through the physical hardware.

Lastly, we have integrated computing, and this is when an object is kind of like a transformer: it is both the everyday object and a piece of computing hardware. Think about the smart home. Your light bulb isn't just a light bulb anymore; for some people it's "Hey Siri, turn up my lights" or "Hey Siri, turn on my music." It's not just a speaker anymore; it's a set of interconnected things. (My Siri actually thought I was giving it instructions just now, sorry about that.) The same goes for self-piloting vehicles. If we look at a Tesla going down the street in self-driving mode, that is integrated computing: it is a car, yes, and it is also a piece of technology hardware. It meets all the qualifications we've talked about: it has a CPU, it has memory, it has inputs and outputs, all the things that make up a piece of hardware, now integrated with something else. So it's a really exciting time. I know I keep saying that, but it's a field I'm very passionate about, because the possibilities are endless. I would love to challenge you to think of another everyday object that will likely end up integrated with computing capabilities. We talked about the toothbrush earlier: it used to just be a toothbrush, and now it tells me exactly when it's time to brush and when it's time to stop, and I have a kids' toothbrush that plays music for the exact amount of time they're supposed to keep brushing. What's an everyday object that's useful as it is, but could be even better if it were integrated with some of the technology we're talking about today?

All right, back to the personal computer. We talked in the last module about the expectation now that you have access to these devices. Has the personal computer become a commodity? The data would tell us yes, because there is minimal differentiation among manufacturers. Hundreds of manufacturers make these hardware components, and then dozens of companies take those parts and components and put them together into the devices you can buy.
There are such minimal differences that if you take away the brand name, and let's set Apple aside for right now, and look at a generic laptop A and laptop B without the logos, they're so similar that the personal computer really has become a commodity, with very small profit margins at this point because these machines are so prevalent. So how can organizations keep their competitive advantage and stay differentiated? We're going to look at Apple for that, because it differentiates its components: they're not the same components used in other machines, they are proprietary. As time marches on and other products start to look very similar to Apple's, that proprietary edge naturally erodes somewhat, but the way Apple has tried to differentiate itself is by not participating in the approach where anyone can manufacture interchangeable components, and instead using its own. In the supplemental materials I have a case study all about how Apple differentiates itself to combat the personal computer becoming a commodity with such small profit margins, and if you've purchased an Apple product lately, you'll know those small margins are not necessarily the case for Apple.

All right, let's wrap up this week on hardware by summarizing. When we think about a piece of technology, the actual hardware involved is a physical, digital component that you can touch. We talked about what makes up a digital device: the memory, the CPU and its processing, the storage, the inputs and the outputs. If it's a physical piece of digital technology that you can touch, that is hardware, and hardware is the first of the five components that make up an information system. Next we're going to talk about the software, the data, the people, and the processes, and I really look forward to digging into those components with you. I hope you have a wonderful day. Thank you so much.