New Technology

Get the latest technology news and articles. A look at the innovations and technologies that are shaping the future and changing the world.

Wednesday, 27 May 2015

Mac vs PC: which one to choose? -- A vlog

Welcome back ladies and gents! We are very excited to bring you a special post today -- a vlog on 'Mac vs PC' made by the four of us. In this video we briefly discuss the advantages and disadvantages of both Mac and Windows computers, and give our opinions on which one is right for you. Below is our video; we hope you find it enjoyable!



And that's our homemade video. Have you made up your mind after watching it? Still not? Don't worry. We have also come up with a summary of the video's contents, as well as some more information that might be useful to you.

Is Mac's Hardware Worth the High Price?

Many people feel the cost of a Mac is too high for what you get, but others justify the price tag by citing top-tier support, higher hardware build quality, and the benefits of the Apple ecosystem. Whether or not you find those points convincing, Mac hardware does come with a number of limitations when it comes to specifications. Most Macs let you upgrade little more than the drive or RAM, and sometimes nothing at all; if you want a machine that is genuinely upgradeable, you're looking at the $2,499 Mac Pro.

Windows Has More Software

The Windows Store already has over 50,000 apps despite its youth, whereas the Mac App Store had a little under 14,000 at the end of its second year.

Macs Have Fewer Viruses and Require Less Maintenance

People used to argue that Macs could not be infected by viruses. That claim has since been proven false, yet many users still trust the Mac's security. Nevertheless, far fewer viruses exist for the Mac, and most Mac users get by just fine without any antivirus protection. Windows, on the other hand, suffers from more than just a few security exploits.

Windows Offers a Better Gaming Experience

Apple hardware offers a paltry selection of graphics cards. While more and more popular games, including several exclusive titles, are available for OS X, if you want a bleeding-edge gaming experience you won't get it from a Mac. You are fairly limited in what you can do with the hardware when running OS X, and oftentimes the same game will simply run better on Windows than on OS X.
To put it simply: for heavy tasks, go for a Windows PC; for day-to-day use, go for a Mac.

Thank you all,
From,
Sakib, Alex, Claire & Nina

Saturday, 23 May 2015

Wireless electricity: live life, wireless...

Your phone has just warned you that its battery is dying, and you've finally got comfortable in bed, but you need your phone fully charged for tomorrow; what should you do now? We face these kinds of situations every now and again, and as a result we have to sacrifice either the comfortable sleeping position or a fully charged phone the next day. Sacrifice no more, because wireless electricity is on its way to the rescue.


No, I don’t mean those inductive charging pads; when I say “wireless” I mean wireless! Those pads are not really wireless, are they? The transmission of electrical power without solid wires or conductors is known as wireless power transfer. Technology has leaped so far ahead that it is now possible to power electrical devices over the air (wirelessly) within a set range; however, the technology is still fairly new.





So how does it work?
Energy is transmitted from a transmitter to a receiver through an oscillating magnetic field. Here is how that is achieved: the power source supplies direct current, which is converted to alternating current by specially designed electronics built into the transmitter. The alternating current energises a copper wire coil inside the transmitter, producing the oscillating magnetic field. Once the coil in the receiver comes within range of that field, the field induces an alternating current in the receiving coil. The receiver's electronics then convert this alternating current back to direct current, and it becomes usable power.
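
To make the physics a little more concrete, here is a minimal Python sketch of the induction step, using Faraday's law (the voltage induced in the receiving coil is proportional to how fast the transmitter current changes). All the numbers, including the frequency and the mutual inductance `M` between the coils, are made-up illustrative values, not the specs of any real charger.

```python
import numpy as np

# Toy model of inductive power transfer (illustrative values only).
f = 100e3        # transmitter AC frequency in Hz (assumed)
I_peak = 2.0     # peak current in the transmitter coil in A (assumed)
M = 5e-6         # mutual inductance between the two coils in H (assumed)

t = np.linspace(0, 4 / f, 4000)             # four AC cycles
i_tx = I_peak * np.sin(2 * np.pi * f * t)   # alternating current in the transmitter coil

# Faraday's law: the voltage induced in the receiver coil is -M * dI/dt,
# so a rapidly oscillating current induces a usable AC voltage next door.
v_rx = -M * np.gradient(i_tx, t)

print(f"Peak induced voltage: {np.abs(v_rx).max():.2f} V")
# Expected analytically: M * 2*pi*f * I_peak, about 6.28 V for these values
```

A rectifier stage (the "specially designed electronics" mentioned above) would then turn that induced AC back into DC for the device.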


People may be concerned about the safety issues, as this means electricity will be passing through and around our bodies; however, specialists have already responded to that, saying it is not dangerous at all because it works the same way our Earth does, using magnetic fields. In some cases it may even be safer than wired technologies.


So, do you think wireless technology is the way to power and charge our electronics in the future? I certainly think so!

Wednesday, 20 May 2015

G-Sync: The way it's meant to be played

This blog post is a bit special, because we are going to step into the "nerdy world" and take a look at the mind-blowing G-Sync technology from Nvidia. As a gamer (a.k.a. nerd) myself, I am pretty excited to show you guys the frontier of the gaming experience.
Before jumping into G-Sync, let me explain an issue that went unsolved for ages before G-Sync existed. When you play games, the action happening on the screen is made up of frames (images). A monitor's refresh rate is measured in hertz (Hz), where 1 Hz corresponds to displaying 1 frame per second. Your monitor has a built-in, fixed refresh rate, and today roughly 90% of gamers are playing on a 1920 x 1080 (full HD) 60 Hz display.

The graphics card, on the other hand, fires off frames as fast as it possibly can, so the frame rate (FPS) is dynamic and can bounce from, say, 30 to 80 FPS in a split second. Obviously there is a conflict between fixed and dynamic: the graphics card renders at a continuously changing frame rate while the monitor refreshes exactly 60 times per second. Whenever the FPS is slower or faster than 60, parts of multiple frames end up on the screen within a single refresh of the monitor. The result is screen tearing, which is really annoying and ruins your gaming experience.

Before G-Sync was invented, the old solution was called V-Sync (vertical synchronization). It forces the graphics card to render at the same rate the monitor refreshes. The screen tearing may be gone, but lag appears between the graphics card's output and the monitor's display of it, which causes screen stuttering.

Nvidia G-Sync module
Today, thanks to Nvidia's G-Sync technology, this problem is solved for good. G-Sync is both a software and a hardware solution that eliminates screen tearing and stuttering. A daughter board is placed inside a G-Sync enabled monitor. With G-Sync, the monitor becomes a 'slave' to your graphics card: its refresh rate becomes dynamic, and each time your graphics card finishes rendering a frame, the monitor's refresh is lined up with it. With the graphics card and monitor dynamically in sync with each other, tearing and stuttering are gone for good.
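
If you like seeing the difference in numbers, here is a small Python simulation of the idea: frames that finish at varying times either wait for the next fixed 60 Hz refresh boundary (V-Sync) or are displayed the moment they are ready (an idealised G-Sync). The render times and counts are invented purely for illustration, not measured from any real game.

```python
import math
import random

random.seed(42)
REFRESH = 1 / 60                                   # fixed 60 Hz scan interval (s)
render = [random.uniform(1 / 80, 1 / 35) for _ in range(300)]  # varying render times (s)

# V-Sync: a finished frame must wait for the next fixed refresh boundary.
t = 0.0
shown_prev = 0.0
prev_gap = None
wait_total = 0.0
stutters = 0
for ft in render:
    t += ft                                        # GPU finishes the frame here...
    shown = math.ceil(t / REFRESH) * REFRESH       # ...but it appears on this tick
    wait_total += shown - t                        # added display latency
    gap = shown - shown_prev                       # on-screen frame-to-frame gap
    if prev_gap is not None and abs(gap - prev_gap) > 1e-9:
        stutters += 1                              # pacing changed: visible stutter
    prev_gap, shown_prev = gap, shown

print(f"V-Sync: average added latency {1000 * wait_total / len(render):.1f} ms, "
      f"{stutters} pacing changes (stutter)")
# G-Sync (idealised): the panel refreshes the instant each frame is ready,
# so added latency is ~0 and on-screen pacing exactly follows the GPU.
print("G-Sync: ~0 ms added latency, pacing follows the render rate")
```

In reality a G-Sync panel only follows the GPU within its supported refresh range, but inside that range the effect is exactly what the toy model shows: no waiting for a fixed tick, so no stutter and no tearing.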
It gets even better: there are currently G-Sync monitors on the market with a 144Hz refresh rate -- yes, the screen is able to actually refresh 144 images per second. Compared with a traditional 60Hz display, the higher refresh rate gives you a much smoother gaming experience. Your eyes will feel far more comfortable while playing in front of one of these panels, and you won't feel dizzy after hours of gaming.

Of course, there is no free lunch: G-Sync comes at a relatively high cost. At the time of writing this blog, the cheapest G-Sync monitor you can get is still over AUD$500. And owning one of these monitors is not enough to get G-Sync working; you will also need an Nvidia graphics card -- yes, unfortunately that's the only option compatible with G-Sync. Last, in order to make the best of this awesome technology, you need a powerful PC that can render games at a high frame rate. While getting G-Sync may cost a small fortune, the truth is that it is absolutely worth it. 'The way it's meant to be played', as Nvidia says, and pretty much every reviewer on the Internet appreciates this game-changing technology. So gamers, brace your wallets and enter the future of computer gaming.






Tuesday, 19 May 2015

FinTech: Manage your money with IT

A contraction of the words ‘Financial’ and ‘Technology’, FinTech has become a ubiquitous term for any technology applied to financial services, typically technology sold into the financial services sector to support the back-office functions of those customers. Recently, though, the term has broadened to cover technology applied to front-end consumer products, and to new entrants competing with existing players. Taken at its broadest, FinTech is shorthand for ‘innovation in financial services’, whether that means new products from new start-ups, or the adoption of new approaches by existing players where technology is the key enabler.

New technologies could be applied to just about anything in the financial services arena, but to name just a few: payments and transactions; mobile banking; trading; commodities markets; peer to peer lending & crowd funding; retail banking; risk & compliance; security & privacy; digital & alternative currencies; digital wallets; financial advisory services; insurance.
Just for starters, aside from darn good tech talent, the new era of financial services will be enabled by data analytics and everything that encompasses (behavioural analytics, machine learning, mass storage, data-driven marketing); by cloud computing, from both a storage and a security perspective; and by broader business model innovation: exploring where the chinks in the value chain exist, and exploiting opportunities in customer relationship management processes and platforms.

Monday, 18 May 2015

Screenless Display


Have you ever wondered how cool it would be to get your hands on one of those screenless, holographic displays, as seen in the Iron Man saga or the Avengers? Then wonder no more, as this piece of technology already exists. According to MIT’s latest technology review, screenless displays are one of the biggest technology breakthroughs; give it a few more years, and they will be readily available to you! The main purpose of this type of display is to transmit information to the user without the aid of a screen or a projector.



There are three types of screenless displays available at the moment:

  •  Visual Image Screenless Display: an example of this would be holograms. These are any screenless images that the eye can see.
  •  Retinal Direct Display: images are projected directly onto the retina of the eye, so to the user they appear to be floating in space.
  •  Synaptic Interface: one step ahead of the retinal direct method. Information is transmitted directly to the brain, instead of being projected to the eyes.

This technology can greatly benefit its users, as it promises higher-resolution images, greater portability, and lower power consumption.
However, it also has its disadvantages, and the main one for now is that this technology of the future won't be cheap!






For further information on screen less displays, click here.

Saturday, 16 May 2015

Cefaly to prevent migraines

Nowadays it would seem like everyone has physical problems due to the increased use of technology, whether it's from a poor sitting position in front of the computer or from eye strain. Migraines can have many causes, and heavy use of computers and other devices can be one of them. However, last year Cefaly was approved by the U.S. Food and Drug Administration as the first device to prevent migraines.


It is only available by prescription, and should only be used for a maximum of 20 minutes per day. Cefaly is a headband-like, battery-powered device that sits across the forehead and over the ears. It uses electrodes to stimulate branches of the trigeminal nerve, which has been associated with migraines.
Dr. Myrna Cardiel, a clinical associate professor of neurology at NYU Langone Medical Center and NYU School of Medicine in New York City, says: "This device is a promising step forward in treating migraines, as it addresses an important part of what we believe triggers and maintains a migraine attack."

The device was developed by Cefaly Technology in Belgium, which promises that 71% of treatments will be successful and that medication use will drop by 75%.
As this device only came out last year, I assume we will soon see more of this, and see how well it actually works on people.
For more information about Cefaly, visit this page.


Wednesday, 13 May 2015

Brain-Computer Interface

A brain-computer interface (BCI) is a collaboration between a brain and a device that enables signals from the brain to direct some external activity, such as control of a cursor or a prosthetic limb. The interface enables a direct communication pathway between the brain and the object to be controlled. In the case of cursor control, for example, the signal is transmitted directly from the brain to the mechanism directing the cursor, rather than taking the normal route through the body's neuromuscular system from the brain to the finger on a mouse.

The reason a BCI works at all is because of the way our brains function. Our brains are filled with neurons, individual nerve cells connected to one another by dendrites and axons. Every time we think, move, feel or remember something, our neurons are at work. That work is carried out by small electric signals that zip from neuron to neuron as fast as 250 mph. The signals are generated by differences in electric potential carried by ions on the membrane of each neuron.
Although the paths the signals take are insulated by something called myelin, some of the electric signal escapes. Scientists can detect those signals, interpret what they mean and use them to direct a device of some kind. It can also work the other way around. For example, researchers could figure out what signals are sent to the brain by the optic nerve when someone sees the colour red. They could rig a camera that would send those exact signals into someone's brain whenever the camera saw red, allowing a blind person to "see" without eyes.
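
As a sketch of how those detected signals become an action, here is a deliberately simplified Python example of the classic BCI loop: record a short window of brain signal, measure its power in the mu band (roughly 8 to 12 Hz, a rhythm over the motor cortex that weakens when you imagine moving), and map that to a cursor command. The signal here is synthetic and the threshold is invented; a real system would be calibrated per user and per electrode.

```python
import numpy as np

fs = 250                        # sampling rate in Hz (typical for EEG hardware)
t = np.arange(0, 1, 1 / fs)     # one second of signal

def band_power(signal, lo, hi):
    """Total power of the signal between lo and hi Hz, via an FFT."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), 1 / fs)
    return spectrum[(freqs >= lo) & (freqs <= hi)].sum()

# Synthetic recordings: at rest the mu rhythm (~10 Hz) is strong; when the
# user imagines moving, it weakens (event-related desynchronization).
rng = np.random.default_rng(0)
noise = rng.normal(0.0, 1.0, fs)
resting = noise + 3.0 * np.sin(2 * np.pi * 10 * t)
imagined_movement = noise

THRESHOLD = 20_000              # made-up cutoff; real BCIs calibrate this per user
for label, epoch in [("resting", resting), ("imagined movement", imagined_movement)]:
    power = band_power(epoch, 8, 12)             # mu band, 8-12 Hz
    command = "hold" if power > THRESHOLD else "move cursor"
    print(f"{label}: mu-band power {power:.0f} -> {command}")
```

Real systems add amplification, artifact rejection, and far smarter classifiers, but the loop is the same: pick up the leaking electric signals, interpret them, and drive a device.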