
machines

now browsing by tag

 
 

New fitness business introduces computer-controlled resistance machines

A heart attack and diabetes diagnosis led Jay Sheer to a workout routine, diet changes, nixing smoking and eventually, to a fitness trainer certification.

Now, he and wife Vicki Sheer are channeling their new-found healthy lifestyle into Sheer Fitness, a business they opened Dec. 1 at 2601 S.W. 21st St. in the Plaza 21 Shops.

The store features a new type of exercise equipment, ARX or adaptive resistance exercise, Vicki Sheer said. Instead of using traditional weights for resistance, ARX equipment uses computer-controlled motorized resistance as an individual exercises.

No other Kansas businesses yet use the equipment, she said.

“It matches the force exerted by the user, so it’s always the perfect rep,” Vicki Sheer said. “There are no weights to drop. You also get more benefit because our bodies can lower a lot more weight than we can lift. When you use traditional weights, you’re always limited by the amount of weight you can lift. With this, it’s like someone giving you a whole bunch of extra weight when you lower it.”
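In control terms, the idea Vicki Sheer describes can be thought of as a feedback loop in which the machine’s resistance continuously tracks the force the user produces. The following Python sketch is only a toy illustration of that concept, not ARX’s actual control code, and the force readings are simulated.

```python
import random

def adaptive_resistance(force_samples):
    """Toy model: the machine's resistance tracks the user's measured force,
    so the load is never more than the lifter can actually move."""
    for t, user_force in enumerate(force_samples):
        resistance = user_force          # motor simply matches the user's output
        yield t, user_force, resistance

# Simulated force readings (newtons), one per second of a repetition.
samples = [random.uniform(200, 600) for _ in range(8)]
for t, force, resistance in adaptive_resistance(samples):
    print(f"t={t}s  user force={force:6.1f} N  machine resistance={resistance:6.1f} N")
```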

Vicki Sheer said her husband discovered information about the equipment online, and they flew to Austin to see it in person and learn more about how it worked.

“We were so impressed with it, we knew we had to bring it to Kansas,” she said.

Mark Alexander, co-founder and CEO of ARX Fit, said the company was founded in 2011 but the next four years were spent in researching and developing the specialized equipment. Sales began in 2016, and he’s seeing growth in the company, primarily in urban centers.

When introducing something new to the market, it’s necessary to educate people about the benefits. But Alexander pointed out that resistance training isn’t a new concept and is well recognized to bring about health benefits.

The difference at ARX is that the machines offer computer-controlled, motorized resistance, shortening the time needed to exercise and also, Alexander said, making it safer.

Weights can be challenging to use, especially for a novice, but it takes only a few minutes to understand how ARX machines work and to start using them.

“It’s good on really either end of the spectrum, whether you’ve been hurt or you’re a little tentative or an exercise novice,” he said. “The system adapts to the user. You’re not necessarily going to guess wrong on the weight. The system will only give you as much as you can put out. On the other end of the spectrum, if you’re a competitive athlete or weekend warrior, it will help you get stronger, and do that better and be more injury free.”

Alexander recommended individuals work out on ARX machines once or twice a week.

Balance is important, he said; exercise is a very simple analogy to the sun and a suntan: really, more is not necessarily better. It’s a stimulus that gives you something, and you have to respect that, if you will, with intense exercise. A, you don’t need a lot, and B, you don’t need it all the time.

Changing their own lifestyles so dramatically — Vicki Sheer also quit smoking — was tough on the Sheers, but the two have been successful. Now, their story brings an added impact to their new business, where they offer free demonstrations on the equipment because the concept is so new. The computer-controlled machinery tracks users’ progress, she said.

“When you’re doing your workout, you can look at a screen that shows exactly how much force you’re exerting every single second you’re working out,” Vicki Sheer said. “That is saved to the cloud, so next time they come in, they can compare their progress. I just put something on Facebook that showed that I had almost a 70 percent increase in strength in my legs in seven weeks time.”
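The per-second force logging Vicki Sheer describes lends itself to a simple progress calculation: compare a session’s peak force to an earlier baseline. A minimal sketch with invented numbers (not her actual data):

```python
# Hypothetical per-second force logs (newtons) from two leg-press sessions.
week_1 = [400, 420, 415, 430, 410]
week_7 = [690, 700, 710, 705, 695]

baseline, latest = max(week_1), max(week_7)
change = (latest - baseline) / baseline * 100
print(f"Peak force change: {change:.0f}%")   # ~65% with these made-up samples
```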

Because the intensity of the workouts is higher than when using regular weights, Vicki Sheer agreed a once-a-week workout is recommended. Sessions begin at $25, but there are reduced rates for buying multiple sessions.

Sheer Fitness is the couple’s first business, and Jay Sheer continues to work full time at another job. The store can be open six days a week, and the Sheers plan to work by appointment. But since it’s new, they’ll be there most days from 9 a.m. to 4 p.m. so that people can drop in for a demonstration of the equipment and a free fitness assessment.

Alexander said the data being collected by the ARX computer interface will help his company track and understand the benefits of using the equipment.

“The evolution has really been that we are becoming a technology company, or an exercise tech company,” he said, rather than simply a fitness company. “We want to start being able to compile some of the data, not the demographic marketing type of data but the biometric data of exercise. A lot of people have spent a lot of time, money and effort in the biometric data of exercise on the endurance or the cardio or aerobic side of things. How many steps, how many miles, the heart rate. We’re just doing that but in resistance exercise.”

Although it’s relatively early in the data-gathering process, Alexander said the company is seeing data to indicate “marked improvements” in bone mineral density and also progress made in controlling diabetes.

For more information about Sheer Fitness and to schedule a demonstration, call the store at (785) 414-4568 or visit them online at www.sheerfitnesstopeka.com.

Rise of the BIOHYBRID MACHINES: Robots part HUMAN part ANDROID on the way

Lead author Leonardo Ricotti, of the BioRobotics Institute at the Sant’Anna School of Advanced Studies, in Pisa, Italy, told Live Science: “You can consider this the counterpart of cyborg-related concepts.

“In this view, we exploit the functions of living cells in artificial robots to optimise their performances.”

If robots on a microscopic scale – nanobots – can be fine-tuned using muscle cells or by carrying beneficial cargo, they will be able to explore the human body and help to cure ailments in a specific part of the body, such as cancerous cells.

Windows 10 now on 600 million machines, not all of them PCs | Ars …


Microsoft CEO Satya Nadella told shareholders that Windows 10 has now passed 600 million monthly active users, picking up 100 million since May of this year.

This number counts all Windows 10 devices used over a 28-day period. While most of these will be PCs, there are other things in the mix there: a few million Xbox Ones, a few million Windows 10 Mobile phones, and special hardware like the HoloLens and Surface Hub. The exact mix between these categories isn’t known, because Microsoft doesn’t say.
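A “monthly active device” count of this kind boils down to tallying the distinct devices seen within a trailing 28-day window. A minimal sketch of that bookkeeping, with invented log entries:

```python
from datetime import datetime, timedelta

# Invented usage log: (device_id, last_seen_timestamp).
usage_log = [
    ("pc-1",    datetime(2017, 11, 20)),
    ("pc-2",    datetime(2017, 10, 1)),   # outside the 28-day window
    ("xbox-1",  datetime(2017, 11, 28)),
    ("phone-1", datetime(2017, 11, 5)),
]

as_of = datetime(2017, 11, 29)
active = {dev for dev, seen in usage_log if as_of - seen <= timedelta(days=28)}
print(f"{len(active)} devices active in the trailing 28 days")   # -> 3
```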

The company’s original ambition (and sales pitch to developers) was to have one billion systems running Windows 10 within about three years of the operating system’s launch. In July last year, the company acknowledged that it won’t hit that target—the original plan called for 50 million or more phone sales a year, which the retreat from the phone market has made impossible. But at the current rate it should still be on track for somewhere in excess of 700 million users at the self-imposed deadline.
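The “in excess of 700 million” estimate follows from a simple linear projection. Assuming roughly 100 million devices were added over the six and a half months from May to late November, and that the self-imposed mid-2018 deadline is about eight months away (both month counts are my rough assumptions), the back-of-envelope arithmetic looks like this:

```python
gained = 600e6 - 500e6            # devices added between May and late November
rate_per_month = gained / 6.5     # roughly 6.5 months elapsed (assumption)
months_left = 8                   # late November 2017 to a ~mid-2018 deadline (assumption)
projected = 600e6 + rate_per_month * months_left
print(f"~{projected / 1e6:.0f} million devices")   # lands a bit over 700 million
```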

Windows 10 isn’t the first release that Microsoft has published user numbers for, of course, but the speed of adoption has a particular importance as the company tries to encourage developers to build apps for the Microsoft Store. The company wants developers to either produce applications using the new Universal Windows Platform APIs, or failing that, to package their existing applications using the older Win32 APIs using the Desktop Bridge (formerly Centennial). In both cases, Microsoft wants devs to sell, distribute, and update those applications through the Store. With UWP and Centennial being exclusive to Windows 10, assuring developers that the market is big enough to be worth targeting is important for achieving this; sluggish Windows 10 adoption would leave developers more inclined to take the safe bet of targeting Windows 7 and ignoring UWP and the Store.


Adoption of the latest Windows 10 version, the Fall Creators Update, version 1709, continues to outpace the uptake of the Creators Update, 1703. Using figures from AdDuplex, version 1709 is on a hair over 20 percent of Windows 10 machines after being widely available for a month and ten days. That’s up from five percent a month ago.

AdDuplex’s numbers also give some insight into the make-up of the market for Microsoft’s Surface-branded computers. The new 2017 Surface Pro seems to have gotten off to a solid start, with just over nine percent of Surface-branded systems being this latest model. Surface Laptop, by contrast, appears to hold a much lower share at just two percent. AdDuplex’s numbers are driven by usage of apps from the Store. Part of this difference in share is likely to be due to different user behaviors; on the one hand, tablet users are probably more inclined to be interested in Store apps, as Store apps are more likely to be accessible to tablet systems. On the other hand, the Laptop defaults to Windows 10 S, which can only use Store apps (though it can be freely upgraded to Windows 10 Pro, which has no such restriction).


Which effect is more significant is hard to say, but either way, it continues to appear that people who want the Surface brand want the flexibility that Surface Pro boasts. That only deepens our confusion as to why the Laptop didn’t include a 360-degree hinge.


600 million machines are now running Windows 10

Microsoft is providing its second Windows 10 statistics update this year, announcing that 600 million devices are now running the company’s latest operating system. The number includes PCs, tablets, Xbox One consoles, HoloLens headsets, and even Surface Hub devices and phones. GeekWire reports that Microsoft CEO Satya Nadella revealed the latest figures during the company’s annual shareholders meeting today.

Microsoft had originally planned to get Windows 10 running on 1 billion devices by 2018. The software giant eventually gave up on that ambitious goal last year, and admitted it will take longer to hit the 1 billion target. How much longer is still unclear, as growth has stalled ever since Microsoft withdrew its free Windows 10 upgrade offer. Microsoft revealed its 500 million figure earlier this year, and the company recently confirmed that it has given up on Windows Phone. There’s now little hope that phones will ever contribute a meaningful amount toward the aim of a billion devices running Windows 10.

Microsoft still appears to be on target to hit its revised milestone figures though. Sources told The Verge earlier this year that Microsoft was targeting 550 million active Windows 10 devices by the end of June, and 575 million by the end of September. Hitting 600 million by the end of November fits this new plan, but it’s not clear exactly when the big billion target will ever be met.

Two Incredible New Quantum Machines Have Made Actual Science Discoveries

Image: Jeff Keyzer/Wikimedia Commons

There’s a nebulous concept that’s floating around the public consciousness, called quantum advantage or quantum supremacy. One of these days, someone is going to boldly declare that they’ve created a quantum computer that can solve some complex problem that a regular computer can’t.

That said, quantum supremacy probably won’t be a single event. More likely it will be a slow process, beginning with a specialized quantum computer solving an incredibly esoteric problem, then progressing to increasingly important problems. While they’re not touting “quantum supremacy” officially, two teams of scientists are announcing that their quantum simulators—advanced quantum computers with very specialized scientific purposes—have made some real scientific discoveries.


“In a way, we already have entered the regime of quantum supremacy,” Mikhail Lukin, a physics professor at Harvard, told Gizmodo. “What we report in our work is really one of the first discoveries made with a quantum machine.”

Brief quantum computing explanation: A computer is a machine that solves problems by manipulating a huge number of bits, physical systems with two possible options like an on-off switch. A qubit is as if that switch was on or off with some probability while it’s performing a calculation—it’s both on and off at the same time. That switch (or qubit) then takes on a fixed value once the user looks at it. The quantum computer imparts the switch with the probabilities, then solves problems by making the switches talk to one another, kind of like tying them together.
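To make the “switch with probabilities” picture concrete, here is a minimal Python sketch of a single qubit: two amplitudes whose squared magnitudes give the odds of reading 0 or 1, and a measurement that collapses it to one fixed value. This is illustrative only and glosses over phase and entanglement.

```python
import math
import random

# A single qubit written as alpha|0> + beta|1>.
alpha, beta = complex(1, 0), complex(1, 0)            # start unnormalized
norm = math.sqrt(abs(alpha) ** 2 + abs(beta) ** 2)
alpha, beta = alpha / norm, beta / norm                # normalize the amplitudes

p_zero, p_one = abs(alpha) ** 2, abs(beta) ** 2        # probabilities of reading 0 or 1

# "Looking at the switch": measurement collapses the qubit to a definite bit.
measured = 0 if random.random() < p_zero else 1
print(f"P(0)={p_zero:.2f}  P(1)={p_one:.2f}  measured -> {measured}")
```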

There are several ways to represent these switches—all you need is a tiny system that obeys the rules of quantum mechanics with two possible states. Companies like IBM and Google are pursuing specially fabricated, ultra-cold superconducting electronic systems. These two new teams instead trap atoms with a system of lasers. They can use the lasers to assign specific properties to the atoms, then allow the system to change over time to simulate some problem. Each trapped atom is a qubit; in this case, one team at Maryland created a system with 53 qubits, and the other, at Harvard, MIT and Caltech, created one based on a different physical principle with 51 qubits, according to a pair of papers published today in Nature.


Both systems use trapped atoms, but the way each represents the two potential qubit states is different. In the Harvard/MIT/Caltech machine, the first state is an atom with an electron close to the center, the nucleus, while the second is an atom with a very far-away electron. The Maryland one relies on ions (atoms missing electrons), their spins (an innate property whose equations look a lot like those of real-world spinning), and an added force supplied by the lasers.

These aren’t the general quantum computers that some think may one day crack the mechanism used to encrypt your passwords. They’re very specific quantum simulators with very specific functions. “These are esoteric problems that we have solved,” said Christopher Monroe, physics professor at the University of Maryland. “In our case we mapped out a phase diagram [how the properties of the system change based on the inputs] of a toy model of magnetism.” Lukin’s team modeled how heat spreads through a specific kind of atomic system. “At the time, what we observed was completely unexpected,” he said.

At 51 and 53 qubits, these systems are definitely in the lead for their type. These trapped-atom systems stay coherent for longer than the kind of quantum computers that IBM and Google are working on, meaning it takes longer for their qubits to collapse into regular bits. But there are other factors when it comes to discussing how good a quantum computer is, including how easy the systems are to scale up and how much control there is over the qubits and which other qubits they talk to. Researchers everywhere are looking to improve all of these aspects.

This announcement is still a big deal on several of those fronts. “They’re an important step in the development of quantum technologies, in particular for quantum simulation,” Christine Muschik from the Institute for Quantum Computing at the University of Waterloo in Canada told Gizmodo. “These two experiments really mark an achievement where one has a fairly large number of qubits and, at the same time, is able to control them pretty well.”

It will take a long time before more general quantum computers turn up. Because of the short coherence times, it can take something like a thousand physical qubits to represent a single qubit resistant to errors, the kind that would be used to make calculations. For now, these computers are best suited for physical simulations and optimization problems, explained Lukin.
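That thousand-to-one overhead is why a 51- or 53-qubit simulator is nowhere near a fault-tolerant general-purpose machine. Treating the ratio quoted above as a rough order of magnitude rather than a precise figure, the arithmetic works out like this:

```python
PHYSICAL_PER_LOGICAL = 1_000      # rough ratio from the article, not a precise spec

for physical in (51, 53, 1_000_000):
    logical = physical // PHYSICAL_PER_LOGICAL
    print(f"{physical:>9} physical qubits -> ~{logical} error-corrected qubits")
```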

Ultimately, Monroe (who co-founded a startup called IonQ) thinks companies like Google and IBM are overlooking these trapped-atom systems. But he thinks that there will be room for several different kinds of physical quantum computers. “There’s room for CDs, magnetic hard drives and tapes,” he said. “I think the same will happen in quantum computing.”

[Nature, Nature]

How this museum makes moldy machines work again, saving historic computers for the future

Living Computers: Museum + Labs Executive Director Lath Carlson. (GeekWire photo / Clare McGrane)

It came from a garage in North Carolina.

“We’d been looking for many years for an IBM 360,” Lath Carlson explained. “A gentleman had passed away and … we bought it sight unseen. It was so rare that when it popped up we wanted it immediately.”

But the executive director of Living Computers: Museum + Labs said the classic 1968 IBM 360/30 mainframe computer came with a multitude of surprises after sitting in a garage for some two decades. It had, Carlson said, been “getting progressively moldier and moldier and moldier … we actually had to have it specially sealed up to remediate all the mold in it. All the manuals that came with it, every page of every manual had to be vacuumed for all the mold spores on it.”

Now, Carlson said, the IBM mainframe boots up, and it has a place of honor inside the carefully air-conditioned ‘cold room’ of the Seattle institution, said to be the only museum in the United States dedicated to both displaying and operating vintage computers.

Now mold-free, an historic IBM 360/30 mainframe. (GeekWire photo / Clare McGrane)

Living Computers was originally created by Microsoft co-founder Paul Allen as a by-appointment-only collection of historically significant computers, from the 1960s to the present. But in October 2012, the south-of-downtown Seattle location opened to the general public, with an emphasis, according to its website, on the “world’s largest collection of fully restored — and usable — supercomputers, mainframes, minicomputers and microcomputers.”

Allen himself wrote that one objective is to recognize “the efforts of those creative engineers who made some of the early breakthroughs in interactive computing that changed the world.” To that end, Living Computers has its own team of engineers that revitalize computers so visitors can experience how they work.

The museum’s Carlson joined GeekWire for an episode of our special podcast series on pop culture, science fiction, and the arts to walk through Living Computers’ two floors of hands-on exhibits, five years after the public unveiling. We discussed some of the stories behind the computers representing our digital heritage, as well as a main floor of lively, interactive displays of newer developments such as virtual reality and self-driving cars that preview tech’s future.

Listen here or download the MP3.

There is a lot that’s unique inside Living Computers. The historic computing systems on its upper floor run the gamut from room-filling “big iron” mainframes like the IBM, to minicomputers (so named, Carlson said, not because they were that much smaller than mainframes, but because they generally could run on office power and cooling), to “microcomputers” — today’s personal computers. They provide both a window into each era’s tech, and sometimes its society.

GeekWire’s Frank Catalano and Living Computers’ Lath Carlson. (GeekWire photo / Clare McGrane)

Take the Digital Equipment Corporation PDP-7 minicomputer, introduced in 1964, which Carlson said is the only one still running in the world. A photo of a pipe-puffing operator sits nearby. “A lot of the mainframes and minis built in the 1960s and ’70s had ashtrays built into the counters,” Carlson said. “You’d be sitting there working on the machine and, of course, you’re smoking a cigarette or a pipe and you need an ashtray.”

Or the Apple 1, one of about two hundred that Apple co-founders Steve Jobs and Steve Wozniak originally made. Carlson said theirs is “the only one that’s regularly operated.” Visitors who try it, though, may be surprised by the computer’s case. There is none, only protective Plexiglas around a circuit board. “You just got the bare board; you had to supply your own monitor, keyboard, and everything,” Carlson said. All for a mere $666.66 in 1976 dollars.

Gates’ and Allen’s original Traf-O-Data (GeekWire photo / Clare McGrane)

Also unique from around the same time: a device the size of a large microwave oven called the Traf-O-Data. While the computer might not be familiar to many, its creators would be. “This is Paul Allen and Bill Gates’ first company,” Carlson said. “They started this in high school. This particular computer was built in a dorm room at U.W. … we have the only version they ever made, here in the museum.” (Allen, perhaps not coincidentally, is Living Computers’ founder.)

Yet Carlson says the most popular computer for visitors isn’t a one-of-a-kind. As a matter of fact, it has a reputation for telling its many users, “You have died of dysentery.”

“The Apple II really gets noticed a lot,” Carlson said. “I think that has to do with people recognize The Oregon Trail running on it, which is the most popular piece of software that we have here. And they recognize the look of it, mostly because they were used so much in computer labs and middle schools and high schools in the 1980s and into the 1990s.”

School fixture Apple IIe and The Oregon Trail. (GeekWire photo / Frank Catalano)

Living Computers’ critical mass of operating vintage computers, tied to its stated mission “to maintain running computer systems of historical importance,” has also made it a valuable go-to resource for other organizations.

“Something that’s becoming a frequent request for us is that somebody that has software on a format that’s no longer readable by machines that they have, including people like NASA, coming to us going, ‘Hey, we have these things on IBM tape. We have no way to read it. Can you read it?’” Carlson said. “Because in a lot of cases we’re the only people in the entire world that has the operating hardware to read those old media formats.”

Lively exhibits of computing’s future on the main floor. (GeekWire photo / Dan DeLong)

Those requests represent a cautionary tale for today’s individual hoarders of old technology, too. “This is an under-appreciated global problem that we have,” Carlson said. It’s not just that the information is in unreadable formats. “They have it on a media that’s actually physically falling apart. So if you have old CDs, DVDs, Jaz drives, floppies, there’s a high likelihood that whatever you think is on there is actually gone forever,” he said.

His advice? “Load your stuff to the cloud because it may not be readable very soon.” Carlson only has to go to an earlier generation of data storage for horror stories. “A lot of the old magnetic tape and especially the disk packs, the physical particles on the disks are falling off,” he said. “A lot of times when we go to read it, when we spin it up to speed to read it, all the particles will fly off and there’s nothing left.”

Unlike magnetic media, paper punch cards persist. (GeekWire photo / Clare McGrane)

It’s that kind of experience that appears to take Carlson full circle to continue to respect the durability of some of the oldest systems in Living Computers’ decades-deep collection. “Fortunately, a lot of our machines can run off of paper tape,” he said. “Paper tape or punch cards actually hold up just fine because it’s literally physical holes punched in paper.”

These are the kinds of insights a museum that just displays static, non-working technology is unlikely to be able to have. “We really focus on running the machines,” Carlson said, “and not just collecting.”

Podcast production and editing by Clare McGrane.

 Previously in this series: Public radio’s digital moment: Smartphones, streaming, and the future of listening

Linux Containers vs Virtual Machines – Datamation

Ever since containers on Linux became popular, determining the difference between Linux containers and virtual machines has become trickier. This article will provide you with the details to understand the differences between Linux containers and virtual machines.

Linux Containers vs Virtual Machines – Applications vs Operating Systems

One of the first things to understand about containers vs virtual machines is that one is used for applications and the other is designed for entire operating systems. This is why you’ll often see an enterprise application running in a container instead of in its own virtual machine. There are some interesting advantages to using a container over a virtual machine.

One of the biggest advantages of a container is the fact that you can set aside fewer resources per container than you might per virtual machine. Keep in mind, a container is essentially for a single application, while a virtual machine needs resources to run an entire operating system.

To make this even simpler, consider the following. If you need to run multiple instances of MySQL, NGINX, or other services, using containers makes a lot of sense. If, however, you need a full LAMP stack running on its own server, there is a lot to be said for running a virtual machine. A virtual machine gives you greater flexibility to choose your operating system and upgrade it as you see fit. A container, by contrast, keeps the configured application isolated from OS upgrades on the host.
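For the multiple-NGINX/MySQL scenario, the per-container resource caps mentioned above are typically set at launch time. A minimal sketch using the Docker SDK for Python (this assumes Docker and the `docker` package are installed; the container names and limits are illustrative, not a recommendation):

```python
import docker

client = docker.from_env()

# Two isolated NGINX instances, each with its own modest resource cap,
# instead of a full virtual machine per service.
for name in ("web-a", "web-b"):
    client.containers.run(
        "nginx:alpine",
        name=name,
        detach=True,
        mem_limit="256m",          # cap memory per container
        nano_cpus=500_000_000,     # roughly half a CPU core
        ports={"80/tcp": None},    # publish port 80 on a random host port
    )
```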

Linux Containers vs Virtual Machines – Use Case Scenarios

One of my favorite examples of where using a container makes the most sense is with Linux library versions. For example, let’s say you have a mission-critical application that requires a specific version of Python. Then you run updates on the box housing the application and suddenly, that Python version changes, rendering said application non-functional. Had the application been packaged in a container with its own Python runtime, the host update would not have touched it.

Another key benefit to using containers is the idea that you can take an application, put it into a container and run it on any OS that supports the container type you’re running. One example of this is when you want to have an application that runs on multiple deployments using different Linux distros. By using a container, you can run a similar application environment on a variety of different distros. Containers provide portability.
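Both the Python-version problem and this portability point come down to the same thing: the container image, not the host, decides which runtime the application sees. A small sketch, again using the Docker SDK for Python, with a pinned interpreter image chosen purely for illustration:

```python
import docker

client = docker.from_env()

# The host distro can upgrade its system Python freely; the application
# keeps the interpreter baked into its image.
output = client.containers.run(
    "python:3.6-slim",                 # pinned interpreter image (illustrative)
    ["python", "--version"],
    remove=True,                       # clean up the one-shot container
)
print(output.decode().strip())         # e.g. "Python 3.6.x", regardless of the host
```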

One could even make the argument that containers make more sense for speedy cross-distro deployments whereas virtual machines make sense for single application use situations like running a LAMP stack.

Linux Containers vs Virtual Machines – Security

It’s widely accepted that virtual machines offer a bit more in terms of security when compared to containers. This isn’t to say that containers can’t be secured; rather, by default virtual machines offer greater isolation overall. Remember, containers share the host’s kernel and system resources in a way that virtual machines do not.

Some things you can do to minimize risk when running containers include avoiding superuser privileges, making sure the containers are obtained from trusted sources and, of course, keeping them up to date. Thankfully, some containers are digitally signed, which helps confirm that you’re getting a container from a trusted source.
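The “avoid superuser privileges” advice can also be expressed at launch time. Here is a hedged sketch of a few common hardening options via the Docker SDK for Python; the image, workload, and specific options are illustrative, and not every application will tolerate all of them:

```python
import docker

client = docker.from_env()

hardened = client.containers.run(
    "alpine",
    ["sleep", "3600"],                 # placeholder workload for the sketch
    detach=True,
    user="1000:1000",                  # run as an unprivileged user, not root
    read_only=True,                    # read-only root filesystem
    cap_drop=["ALL"],                  # drop all Linux capabilities
    security_opt=["no-new-privileges"],
)
print(hardened.short_id)
```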

Lastly, you need to keep a container to a single function. Once you start combining software duties in one container, you’ll find you’re better off using a virtual machine instead. To reiterate, containers are for single-purpose applications and virtual machines are for multiple-purpose workloads. Stick to this methodology and you’ll be in a far better situation with both security and overall functionality.

Linux Containers vs Virtual Machines – Selecting the Right Tools

Regardless of the virtual machine or container type, the key to choosing the right one for your needs comes down to researching the abilities of each option. In the container realm, Docker offers a strong enterprise solution. This appeals to companies looking at Docker containers because they know they can get the support they need if anything comes up. Docker is also considered an enterprise-customer-friendly option, especially with Docker Swarm. Comparing Docker Swarm to Kubernetes, it’s widely accepted that Kubernetes is far more complex to set up in advanced environments.

Back on the virtual machine front, I’ve always found that VirtualBox makes for a great desktop-oriented virtual machine environment, whereas VMware does a splendid job on the server side of things with its various offerings. VMware has virtual machine solutions ranging from storage to cloud server solutions. There are other options available (various hypervisors, etc.), however I think that VirtualBox and VMware represent the two realms of desktop to server virtualization nicely.

Linux Containers vs Virtual Machines – And the Winner Is?

Before trying to decide between a container and a virtual machine, consider the following. You can run containers on VMs should you choose to. Understanding this is important, as there are no clear winners or losers here. In fact, the two technologies serve completely different needs.

Containers will continue to see the bulk of the spotlight in the press as they allow their users to run more efficiently with less hardware. On the flip side of things, virtual machines remain a staple in the server and cloud space. Suffice it to say, virtualization isn’t going anywhere and is just as hot as containers.

The one area that I think is worth watching is how things play out between Kubernetes and Docker Swarm. I think it’ll be interesting to see how these two container management technologies play out and which one becomes the standard. For a while, it felt like it would be Docker. These days, however, we’re seeing a lot more from the Kubernetes camp. It’s entirely possible that over time we will begin to see Kubernetes grabbing headlines and becoming the top player in the container camp.

What say you? Do you believe that containers are on track to outpace virtual machines? Perhaps instead you believe a combination of the two technologies are where things are headed? Whatever the case may be, hit the comments and let’s hear your view. Are you running a Plex container at home while utilizing the power of a full virtualized environment at work? I’d love to hear about your experiences with these technologies.


5 New & Powerful Dell Linux Machines You Can Buy Right Now


The land of powerful PCs and workstations isn’t barren anymore when we talk about Linux-powered machines; even all of the world’s top 500 supercomputers now run Linux.

Dell has joined hands with Canonical to give Linux-powered machines a push in the market. They have launched five new Canonical-certified workstations running Ubuntu Linux out of the box as part of the Dell Precision series. An advantage of buying these Canonical-certified machines is that users won’t have to worry about incompatibility with Linux.

5 Linux PCs you need to check out

Check out the specifications page of these Linux machines:

Dell Precision 5720


Price: $1,499 (Starting)

It is a 27-inch all-in-one (AIO) workstation which offers Ubuntu 16.04 LTS as an option along with RHEL WS v7.0 and Windows 10. For CPU options, you can choose from a number of Intel Xeon, 6th Gen and 7th Gen Core processors, with the added benefit of AMD Radeon Pro WX graphics chips.

Dell Precision 7720


Price: $1,379 – $4,849

Dell Precision 7720 is a 17-inch (1600×900, LED-backlit) laptop which comes in four configurations, with the base model featuring an Intel Core i5-7300HQ and the top one, priced at $4,849, packing an Intel Xeon E3-1505M v6. The memory options for this Dell Linux laptop range between 8GB and 64GB, whereas the graphics department for the highest model is handled by an Nvidia Quadro P4000 with 8GB of GDDR5.

Dell Precision 7520

Price: $1,049 – $3,149

The smaller version of the Precision 7720 is the Dell Precision 7520, with the 15-inch FHD display you find on most laptops. For this Linux machine, the base model comes with an Intel Core i5-7300HQ, and the Intel Xeon E3-1505M v6 finds its home in the top model. Dell has managed to reduce prices in the graphics department: it offers an Nvidia Quadro M2200 with 4GB of GDDR5 for the top model. The RAM can be bumped to 32GB DDR4.

Dell Precision 5520

Another Dell Linux laptop series in the line is the Precision 5520, which comes with an Intel Core i5-7440HQ (8GB RAM, 500GB HDD) for the base model and an Intel Xeon E3-1505M v6 (32GB RAM, 512GB SSD) at the highest price. The series features a 15-inch FHD display and an Nvidia Quadro M1200 GPU on the top model.

Dell Precision 3520

Price: $879 – $1,999

Last, there is the 15-inch Precision 3520 series, which is the cheapest. The base model comes with an Intel Core i5-7440HQ (4GB RAM, 500GB HDD). You will notice cost cutting in the screen resolution, which is limited to 1366×768 pixels. But you still have graphics options from Nvidia.

It’s great to see that we can now find powerful Linux machines; in fact, I probably haven’t previously seen a Linux-powered computer that touches a $5,000 price tag. These machines might not include the latest beasts from Intel and AMD, but they are certainly a lot better than what Linux users got in the past.

You can check out the certification details for these Linux laptops from Dell in Ubuntu’s post using this link.

What are your views on these Dell Linux laptops? Drop your thoughts in the comments.

Also Read: Open Source Pioneer Munich Votes To Move All Remaining Linux PCs To Windows 10 In 2020

Detroit Election Day: Computer glitches, faulty voting machines reported


Linux Is Powering Almost Half Of All Microsoft Azure Virtual Machines

Around 40% of Microsoft’s Azure VMs now run Linux distributions, according to a tweet from the Microsoft Developer UK Twitter account, ZDNet reports. That information was retweeted by Linuxing community manager Brian Byrne.

The number represents a rise of roughly 7 percentage points since June 2016, when Microsoft reported that one out of every three Azure VMs was running Linux.
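For reference, that "roughly 7 points" follows directly from the two figures in the reports:

```python
june_2016_share = 100 / 3      # "one out of every three" Azure VMs, ~33.3%
current_share = 40.0           # roughly 40% now
print(f"Rise: about {current_share - june_2016_share:.1f} percentage points")  # ~6.7
```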

Users of Microsoft’s popular cloud computing service can choose between various supported Linux distributions like CentOS, CoreOS, Debian, Oracle Linux, RHEL, SUSE Linux Enterprise, openSUSE, and Ubuntu.

Earlier this year, the company also added support for Intel’s Clear Linux. And now, users can also load Kali Linux on their Azure virtual machines.

What are your views on this? Drop them in the comments.

Also Read: Who Contributes To Linux Kernel? How Has Its Development Evolved In Past 26 Years?


