
Friday, 17 October 2014

Your next smartphone or EV will recharge to 70% in just two minutes, thanks to new lithium-ion battery tech


Scientists at Nanyang Technological University (NTU) in Singapore have developed a new lithium-ion battery that can be recharged to 70% capacity in just two minutes. In addition to being able to charge your smartphone or electric car in just a few minutes, this new lithium-ion battery (LIB) can also endure more than 10,000 charge/discharge cycles — or about 20 times more than current LIBs. Perhaps most excitingly, though, NTU’s new Li-ion tech has already been patented, is compatible with existing battery manufacturing processes, and has “attracted interest from the industry.” Unlike many other lithium-ion battery advances, this one might actually hit the market within a couple of years.
As you’re probably aware, modern life (perhaps a little unnervingly or depressingly) is inextricably linked to batteries. How long a device lasts on battery power, and how long it takes to recharge, have a direct impact on most aspects of our work and social lives — and it’s only going to get worse as wearable computing, electric vehicles, and the internet of things take hold. While we do occasionally see incremental changes and improvements to battery technology, we are still mostly beholden to lithium-ion battery tech that was commercialized by Sony way back in 1991. NTU’s new lithium-ion battery design, which allows for ultra-fast recharging and extreme endurance, could be the big breakthrough that we desperately need.
Titanium dioxide — otherwise known as that cheap white powder that’s used in paint, sunscreen, solar panels, and more.
NTU’s new battery, developed by Chen Xiaodong and friends, replaces the LIB’s customary graphite anode with a gel of titanium dioxide (TiO2) nanotubes. It’s news to me, but apparently titanium dioxide — a very cheap, plentiful substance that you might know as titania — is very good at storing lithium ions, and thus electrical charge. By using a nanostructured gel, the surface area of the anode — and thus its ability to quickly pick up lots and lots of lithium ions — is dramatically increased. The NTU research paper seems to mostly focus on the process used to create these titanium dioxide nanotubes. In short, though, they just stirred a mixture of titanium dioxide and sodium hydroxide — at just the right temperature, the stirring encourages the TiO2 to form long nanotubes. Suffice it to say, this simple process is “easy to integrate” into current production processes. [DOI: 10.1002/adma.201470238 - "Nanotubes: Mechanical Force-Driven Growth of Elongated Bending TiO2-based Nanotubular Materials for Ultrafast Rechargeable Lithium Ion Batteries"]
Read: Stanford creates ‘Holy Grail’ lithium battery, could triple smartphone and EV battery life
Currently, one of the biggest problems of lithium-ion batteries is that they can’t be charged very quickly. By replacing the graphite anode with NTU’s titanium dioxide gel, the researchers say they’ve created LIBs that can be recharged to 70% capacity in just two minutes. Furthermore, because the new gel is much more resistant to microfracturing and dendrite formation, the new batteries have extreme endurance of over 10,000 charge/discharge cycles — about 20 times more than current LIBs. This second feature is obviously big news for electric vehicles like the Tesla Model S, which will need a costly replacement battery pack every few years. “With our nanotechnology, electric cars would be able to increase their range dramatically with just five minutes of charging, which is on par with the time needed to pump petrol for current cars,” says Chen. The first feature — ultra-fast recharging — is awesome news for just about everyone.
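To put those claims in perspective, here’s a quick back-of-the-envelope calculation (a Python sketch; the 60kWh pack is a hypothetical EV-sized example, not an NTU figure):

```python
# Back-of-the-envelope numbers for "70% charge in two minutes".
# The 60 kWh pack is a hypothetical EV-sized example, not an NTU spec.
pack_kwh = 60.0          # hypothetical EV battery pack capacity
charge_fraction = 0.70   # fraction of capacity restored
minutes = 2.0            # claimed recharge time

hours = minutes / 60.0
energy_kwh = pack_kwh * charge_fraction
avg_power_kw = energy_kwh / hours     # average power the charger must deliver
c_rate = charge_fraction / hours      # charge rate as a multiple of capacity

print(f"{energy_kwh:.0f} kWh in {minutes:.0f} min -> {avg_power_kw:.0f} kW average")
print(f"Effective charge rate: {c_rate:.0f}C")   # ~21C
```

Even a phone-sized 10Wh cell charged at that 21C rate would need around 210W, so charger and cabling design will have to keep pace; for an EV pack, it implies megawatt-class charging hardware.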
NTU Singapore says the new LIB technology has already been patented (presumably the method of making TiO2 nanotubes), and has attracted interest from the industry. Chen says the first generation of fast-charging batteries should hit the market within two years. In the meantime, software-based fast charging solutions and power-saving modes should keep us out of the electro-mobility chasm for a little longer.

Sunday, 15 June 2014

D-Wave confirmed as the first real quantum computer by new research


Ever since D-Wave arrived on the scene with a type of quantum computer capable of performing a problem-solving process called annealing, questions have flown thick and fast over whether or not the system really functioned — and, if it did function, whether it was actually performing quantum computing. A new paper by researchers who have spent time with the D-Wave system appears to virtually settle this question — the D-Wave system appears to actually perform quantum annealing. It would therefore be the first real quantum computer.
Up until now, it had been theorized, based on some less-than-clear benchmark results, that D-Wave might merely be simulating a quantum computer. This new data seems to disprove that theory. Why? Because it shows evidence of entanglement. Quantum entanglement refers to a state in which two distinct qubits (two units of quantum information) become linked. If you measure the value of one entangled qubit as 0, its partner will also measure 0. Measure a 1 at the first qubit, and the second qubit will also contain a 1, with no evidence of communication between them.
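To make that concrete, here’s a tiny numerical simulation of a two-qubit Bell state, the textbook example of entanglement (a generic Python illustration of correlated measurements, not a model of D-Wave’s annealing hardware):

```python
import numpy as np

# Two-qubit Bell state (|00> + |11>)/sqrt(2): the amplitudes live on the
# four basis states |00>, |01>, |10>, |11>.
state = np.zeros(4)
state[0b00] = 1 / np.sqrt(2)
state[0b11] = 1 / np.sqrt(2)

probs = np.abs(state) ** 2            # Born rule: |amplitude|^2
rng = np.random.default_rng(seed=42)
outcomes = rng.choice(4, size=10_000, p=probs)

qubit_a = (outcomes >> 1) & 1         # measured value of the first qubit
qubit_b = outcomes & 1                # measured value of the second qubit
print("measurements always agree:", bool(np.all(qubit_a == qubit_b)))  # True
```

Sampling the state never produces mismatched bits: the two qubits’ outcomes are perfectly correlated, even though neither is individually predetermined.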
Researchers working with a D-Wave system have now illustrated that D-Wave qubit pairs become entangled, as did an entire set of eight qubits. (The D-Wave uses blocks of eight qubits, as shown below). [DOI: http://dx.doi.org/10.1103/PhysRevX.4.021041 - "Entanglement in a Quantum Annealing Processor"]
The D-Wave 2 Vesuvius chip, with 512 qubits
Assuming the experimental evidence holds up, this fundamentally shifts the burden of proof from “Prove D-Wave is quantum” to “Prove the D-Wave isn’t quantum.” Evidence of entanglement is the gold standard for whether or not a system is actually performing quantum computing.

So, now what?

Now that we have confirmation that D-Wave is a quantum computer (or at least, as close to confirmation as we can likely get), the question is, how do we improve it? As we’ve previously covered, the D-Wave isn’t always faster than a well-tuned classical system. Instead of arguing over whether or not an Nvidia Tesla GPU cluster with customized software is a better or worse investment than a supercomputer that’s cryogenically cooled and computes via niobium loops, we’re going to look at what D-Wave needs to do to improve the capabilities of its own system. As Ars Technica points out, its architecture is less than ideal — for some problems, D-Wave can offer fewer than 100 effective qubits, despite newer systems having 512 qubits in total, because the architecture is only sparsely connected. Each group of eight qubits is fully connected internally, but each island of eight has just eight connections, split between two adjacent islands.
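To get a feel for just how sparse that is, here’s a toy sketch of the “islands of eight” layout described above (the chain of islands with four couplers per neighbour is my simplifying assumption, not D-Wave’s exact Chimera graph):

```python
from itertools import combinations

# Toy "islands of eight" connectivity: 64 islands x 8 qubits = 512 qubits.
# Within an island every qubit couples to every other; between islands
# there are just eight couplers, four to each neighbour in a simple chain.
ISLANDS, SIZE, LINKS = 64, 8, 4

edges = set()
for isl in range(ISLANDS):
    base = isl * SIZE
    # fully connected inside the island: C(8,2) = 28 couplers
    edges.update(combinations(range(base, base + SIZE), 2))
    # four couplers to the next island in the chain
    if isl + 1 < ISLANDS:
        edges.update((base + k, base + SIZE + k) for k in range(LINKS))

qubits = ISLANDS * SIZE
print(f"{qubits} qubits, {len(edges)} couplers")                       # 2044
print(f"fully connected would need {qubits * (qubits - 1) // 2}")      # 130816
```

Roughly two thousand couplers versus the 130,816 a fully connected 512-qubit machine would need is exactly why densely connected problems can only use a fraction of the chip.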
The D-Wave Two’s cryogenic cooling system. There’s a qubit chip in there, somewhere.
D-Wave has stated that it intends to continue increasing the number of qubits it offers in a system, but we can’t help wondering if the company would see better performance if it managed to scale up the number of interconnects between the qubit islands. A quantum system with 512 qubits but more than just two connections to other islands might allow for much more efficient problem modeling and better overall performance.
Inevitably this kind of questioning turns to the topic of when we’ll see this kind of technology in common usage — but the answer, for now, is “you won’t.” There are a number of reasons why quantum computing may never revolutionize personal computing, many of them related to the fact that it relies on large amounts of liquid nitrogen. According to D-Wave’s documents for initial deployments, its first systems in 2010 required 140L of LN2 to initially fill and boiled off about 3L of fluid a day. Total tank capacity was 38L, which required twice-weekly fill-ups. The Elan2 LN2 production system is designed to produce liquid nitrogen in an office setting and can apparently create about 5L of LN2 per day at an initial cost of $9500. [Read: Google’s Quantum Computing Playground turns your PC into a quantum computer.]
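Those consumption figures are easy to sanity-check using only the numbers quoted above (a quick Python sketch):

```python
# Sanity-checking the LN2 logistics from D-Wave's early deployment figures.
initial_fill_l = 140.0   # litres to fill the system initially
tank_capacity_l = 38.0   # on-board tank capacity
boil_off_l_day = 3.0     # litres boiled off per day
elan2_l_day = 5.0        # litres/day an Elan2 generator can produce

print(f"Full tank covers {tank_capacity_l / boil_off_l_day:.1f} days of boil-off")
print(f"Weekly boil-off: {boil_off_l_day * 7:.0f} L")
print(f"Elan2 weekly output: {elan2_l_day * 7:.0f} L "
      f"(surplus of {(elan2_l_day - boil_off_l_day) * 7:.0f} L/week)")
```

A single Elan2’s 5L/day comfortably outpaces the 3L/day boil-off, though that’s before you factor in the $9500 up-front cost.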
Did I mention that you have to pay attention to Earth’s magnetic field when installing a D-Wave system, that the early systems created about 75dB of noise, and that the whole thing weighs 11,000 pounds? Many of these issues confronted early computers as well, but the LN2 issue is critical — quantum computing, for now, requires cryogenic temperatures — and unless we can figure out a way to bring these systems up to something like ambient air temperature, they’ll never fly for personal use. Rest assured that lots of research is being done on the topic of room-temperature qubits, though!

Saturday, 7 June 2014

The History of Computers

"Who invented the computer?" is not a question with a simple answer. The real answer is that many inventors contributed to the history of computers and that a computer is a complex piece of machinery made up of many parts, each of which can be considered a separate invention.
This series covers many of the major milestones in computer history (but not all of them) with a concentration on the history of personal home computers.



Year | Inventors/Inventions | Description of Event
1936 | Konrad Zuse, Z1 Computer | First freely programmable computer.
1942 | John Atanasoff & Clifford Berry, ABC Computer | Who was first in the computing biz is not always as easy as ABC.
1944 | Howard Aiken & Grace Hopper, Harvard Mark I Computer | The Harvard Mark I computer.
1946 | John Presper Eckert & John W. Mauchly, ENIAC 1 Computer | 20,000 vacuum tubes later...
1948 | Frederic Williams & Tom Kilburn, Manchester Baby Computer & the Williams Tube | Baby and the Williams Tube turn on the memories.
1947/48 | John Bardeen, Walter Brattain & William Shockley, The Transistor | No, a transistor is not a computer, but this invention greatly affected the history of computers.
1951 | John Presper Eckert & John W. Mauchly, UNIVAC Computer | First commercial computer; able to pick presidential winners.
1953 | International Business Machines, IBM 701 EDPM Computer | IBM enters into 'The History of Computers'.
1954 | John Backus & IBM, FORTRAN Computer Programming Language | The first successful high-level programming language.
1955 | Stanford Research Institute, Bank of America & General Electric, ERMA and MICR | The first bank-industry computer; also MICR (magnetic ink character recognition) for reading checks.
1958 | Jack Kilby & Robert Noyce, The Integrated Circuit | Otherwise known as 'the chip'.
1962 | Steve Russell & MIT, Spacewar Computer Game | The first computer game invented.
1964 | Douglas Engelbart, Computer Mouse & Windows | Nicknamed the mouse because the tail came out the end.
1969 | ARPAnet | The original Internet.
1970 | Intel 1103 Computer Memory | The world's first available dynamic RAM chip.
1971 | Faggin, Hoff & Mazor, Intel 4004 Computer Microprocessor | The first microprocessor.
1971 | Alan Shugart & IBM, The "Floppy" Disk | Nicknamed the "floppy" for its flexibility.
1973 | Robert Metcalfe & Xerox, Ethernet Computer Networking | Networking.
1974/75 | Scelbi & Mark-8 Altair & IBM 5100 Computers | The first consumer computers.
1976/77 | Apple I, II & TRS-80 & Commodore Pet Computers | More first consumer computers.
1978 | Dan Bricklin & Bob Frankston, VisiCalc Spreadsheet Software | Any product that pays for itself in two weeks is a surefire winner.
1979 | Seymour Rubenstein & Rob Barnaby, WordStar Software | Word processors.
1981 | IBM, The IBM PC - Home Computer | From an "Acorn" grows a personal computer revolution.
1981 | Microsoft, MS-DOS Computer Operating System | From "Quick And Dirty" comes the operating system of the century.
1983 | Apple Lisa Computer | The first home computer with a GUI (graphical user interface).
1984 | Apple Macintosh Computer | The more affordable home computer with a GUI.
1985 | Microsoft Windows | Microsoft begins the friendly war with Apple.

SERIES TO BE CONTINUED

Thursday, 27 March 2014

This is my rig: Bill Howard’s million-photo $10,000 digital photography workstation


The quest for my new rig started with the need for more speed and fewer crashes. No surprise there. I wanted to edit photos without the lag while Photoshop ambled to life and pulled a big photo off a slow hard drive, and without the 20-second wait for Photoshop to blur the background of a photo. I’d do almost anything — buy almost any CPU or SSD — to avoid the annoying pauses that happen hundreds of times a day.
I wound up building a rig with a six-core CPU on a workstation-class motherboard, dual SSDs and a four-drive RAID array, a single graphics card, and dual monitors. It has a keyboard so powerful it needs two USB cables, plus three scanners and an eight-bay backup server. The rig stores a million photos inside the system unit, and twice that on a NAS.
Some of this may sound like overkill. It is and it isn’t. Here’s why.

Wanted: A PC that executes in the blink of an eye

Previously I was working with a four-year-old PC — upgraded but still slow — and a three-year-old iMac. A refresh was overdue. The main PC was painfully slow at handling photos, and got worse as I worked my way into video editing. GoPro video cameras and DSLRs may be small, but they churn out a prodigious amount of data.
My goal was blazing fast speed for photos and, secondarily, for video. How fast? Intel founder Andy Grove said of benchmarks, “Fast is when something happens in the blink of an eye.” In other words, like obscenity, you’ll know fast when you see it. I wanted to be able to instantly flip through 100 full-screen images with no hesitation. I wanted to render a blurred background in five seconds, not 15 or 20. I didn’t want to wait 45 minutes for Lightroom to import 1000 photos and create full-size previews.
A million photos takes up 6TB-8TB, so I wanted 8TB, minimum, of internal storage for direct access to all of them. I also wanted data redundancy on board, meaning some form of RAID. I wanted twice as much backup storage on a server. I wanted a system that stayed cool, a comfortable keyboard and mouse since I’m also a writer, and video conferencing for talking to clients, friends, and organizations that have discovered Skype and Google Hangouts.
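The storage arithmetic behind those targets is simple to sketch out (a back-of-the-envelope estimate; the per-photo average is derived from the totals above, not a measured figure):

```python
# Storage sizing from the figures above: a million photos in 6TB-8TB.
photos = 1_000_000
library_tb = (6.0, 8.0)            # quoted library size range

# Back-calculated average file size (an estimate, not a measured value).
avg_mb = [tb * 1e6 / photos for tb in library_tb]   # 1 TB = 1e6 MB
print(f"Implied average photo size: {avg_mb[0]:.0f}-{avg_mb[1]:.0f} MB")

internal_target_tb = 8.0               # "8TB, minimum, of internal storage"
backup_target_tb = 2 * library_tb[1]   # "twice as much backup storage"
print(f"Internal target: {internal_target_tb:.0f} TB")
print(f"Backup target: {backup_target_tb:.0f} TB on the server")
```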
My research included checking with tech-savvy friends in the industry and at ExtremeTech. Here are recommendations I heard:
  • SSD is mandatory. Get a second SSD as a hard disk cache or data drive, but you need an SSD for booting up.
  • Load up on RAM. With 16GB-32GB of RAM, no Photoshop project will spill out of RAM to a scratch disk, a major cause of lag on hard-drive-based systems.
  • For Photoshop, the number of processor cores matters more than CPU speed. Performance improves noticeably with 4-6 cores. Beyond six, price goes up faster than performance. Also, in addition to Intel’s Core i7, consider Xeon.
  • For Adobe software, a midrange graphics card works well. Just make sure it supports OpenGL.
  • Don’t skimp on backup. You need a couple of external drives or a NAS with room to store at least twice the data you have today.
With that in mind, I went to work. I actually started building two rigs since I needed to update two systems. I built one near-ultimate system and one bang-for-the-buck system. (More on the second system at the end.)

CPU: Intel Core i7-3930K

My main apps, Photoshop and Lightroom, are heavily multithreaded, meaning the CPU can execute multiple tasks simultaneously within the same program. That means they benefit from lots of physical CPU cores. The newer Ivy Bridge and Haswell architectures focus on dual- and quad-core CPUs, so I went with a more established (read: older) CPU technology with six physical cores and 12 threads: the Intel Core i7-3930K Sandy Bridge-E using the LGA 2011 socket ($550). The K suffix means it’s unlocked for easy overclocking; stock turbo tops out at 3.8GHz.
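As a rough illustration of why physical core count matters for this kind of embarrassingly parallel work, here’s a generic sketch that spreads a CPU-bound stand-in task across six worker processes (plain Python, not how Adobe’s apps are actually implemented):

```python
import time
from concurrent.futures import ProcessPoolExecutor

def heavy_task(n: int) -> int:
    """Stand-in for a CPU-bound per-image operation, such as a filter pass."""
    total = 0
    for i in range(n):
        total += i * i
    return total

if __name__ == "__main__":
    jobs = [2_000_000] * 12          # a dozen "images" worth of work

    start = time.perf_counter()
    serial = [heavy_task(n) for n in jobs]
    print(f"serial:   {time.perf_counter() - start:.2f}s")

    start = time.perf_counter()
    with ProcessPoolExecutor(max_workers=6) as pool:   # six physical cores
        parallel = list(pool.map(heavy_task, jobs))
    print(f"parallel: {time.perf_counter() - start:.2f}s")

    assert serial == parallel        # same results, just computed faster
```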
There was little need for the gamer-oriented sibling, the Core i7-4960X, at almost twice the price for a step up from 3.2GHz to 3.6GHz base clock speed. PCMag.com said in its Core i7-3930K review, “For all but the most rabid enthusiasts, Intel’s Core i7-3930K processor represents an unparalleled combination of price and performance for use on the X79 Express platform.” This fit with my belief that you often want to shop one model down from the top.
Atop the CPU, I mounted a Corsair H100i liquid CPU cooler ($110). Its dual-fan radiator replaces one of the case fans. The two hoses obstruct motherboard access less than a cheaper CPU-mounted fan would, though the cooler itself is tougher to mount. It also plugs into an internal USB header so Corsair’s Link software can monitor and control it.
