Thursday 3 November 2016

Java main features

The Java language has a number of distinct characteristics that make it a convenient and efficient tool for developing distributed network applications, cross-platform applications, graphical user interfaces, Web applications, multi-threaded applications and other software. The main features are introduced below.

        (1) Simple to develop and use. Java's basic syntax is very similar to that of C++, but pointers, operator overloading and other easily confused features have been removed. For memory management Java provides garbage collection, so programmers can put more effort into implementing program functionality without worrying about details such as releasing memory. The complex and flexible pointer operations of C++ are a frequent source of serious errors and have always made debugging difficult; this problem does not exist in Java. The Java Virtual Machine can also link local or even remote libraries into a program, so developers do not have to worry about those details either. All of this makes application development simple.
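
        As a minimal sketch of the garbage-collection point (the class name GcDemo and the numbers are made up for illustration), the following program allocates many short-lived objects and never frees a single one by hand; the JVM's collector reclaims them automatically:

        // Objects are created with new and reclaimed automatically by the
        // garbage collector; there is no free() or delete as in C/C++.
        public class GcDemo {
            public static void main(String[] args) {
                for (int i = 0; i < 1_000_000; i++) {
                    byte[] buffer = new byte[1024]; // becomes unreachable after each iteration
                    buffer[0] = 1;
                }
                // No explicit release anywhere: unreachable buffers are collected as needed.
                System.out.println("Allocated a million short-lived buffers without freeing any by hand.");
            }
        }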

        (2) Distributed. Java's support for TCP/IP application protocols such as HTTP and FTP makes it easy for Java programs to establish network connections and to access remote resources through a Uniform Resource Locator (URL) almost as easily as local files. Java's runtime system can also load bytecode dynamically over the network and dynamically use new protocol handlers.
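
        A rough illustration of URL-based access (the class name UrlDemo and the address https://example.com/ are placeholders, not taken from the text) might look like this:

        import java.io.BufferedReader;
        import java.io.InputStreamReader;
        import java.net.URL;

        public class UrlDemo {
            public static void main(String[] args) throws Exception {
                URL url = new URL("https://example.com/");          // remote resource named by a URL
                try (BufferedReader in = new BufferedReader(
                        new InputStreamReader(url.openStream()))) { // open a stream over HTTP
                    String line;
                    while ((line = in.readLine()) != null) {
                        System.out.println(line);                   // print the fetched content
                    }
                }
            }
        }

        Reading a local file follows much the same pattern, which is what makes remote access feel almost as natural as local access.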

        (3) Object-oriented. Unlike C++, Java's syntax is strictly object-oriented and does not allow variables or methods (functions) to be defined outside a class. Java is built on classes and objects: every variable and every method must be contained within some class. This makes the structure of a program clearer and makes inheritance and reuse more convenient.
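
        A small hypothetical example (Animal, Dog and OopDemo are invented names) shows how every field and method must live inside a class, and how inheritance supports reuse:

        class Animal {
            protected String name;                 // state belongs to the class

            Animal(String name) { this.name = name; }

            String speak() { return name + " makes a sound"; }
        }

        class Dog extends Animal {                 // inheritance for reuse
            Dog(String name) { super(name); }

            @Override
            String speak() { return name + " barks"; }
        }

        public class OopDemo {
            public static void main(String[] args) {
                Animal pet = new Dog("Rex");
                System.out.println(pet.speak());   // prints "Rex barks"
            }
        }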

        (4) Security. For network applications this is extremely important. Java treats security as a primary concern and sets up several layers of protection. First, syntax and semantics are checked at compile time. At link time the classes are type-checked again and indirect object access is eliminated. At run time, Java's runtime system performs bytecode verification and keeps track of where objects are stored, so that access can be limited to the appropriate security context. Local classes are run separately from remote classes, preventing a remote system from damaging the local one. Java-enabled browsers additionally let users control what access Java software has to the local system. Together, these measures help ensure the safety of Java programs.

        (5) Platform independence and portability. Java's Application Programming Interface (API) and runtime system are the key to its portability. Java provides a consistent API on every operating system that supports it, so at the API level Java programs do not depend on the platform. During interpretation, the Java runtime system translates bytecode into machine code for the current machine. Developers therefore do not need to consider the hardware or operating-system structure of the target machine, and users only need a Java runtime system to run the compiled bytecode.

        (6) Multi-threaded. Java provides built-in multi-threading support: a program can easily create multiple threads, each performing different work, which keeps the overall design simple. For example, separate threads can control sound and images respectively, making it easy to build complex effects in which audio and video are interwoven; the programmer only has to write the work of each individual thread (a sketch follows below) without orchestrating their interleaving by hand, which also greatly improves a program's interactivity and responsiveness.
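
        A minimal sketch of this idea (the class name ThreadDemo is invented, and simple print loops stand in for the sound and image work) could look like this:

        public class ThreadDemo {
            public static void main(String[] args) throws InterruptedException {
                Thread audio = new Thread(() -> {
                    for (int i = 0; i < 3; i++) System.out.println("playing audio frame " + i);
                });
                Thread video = new Thread(() -> {
                    for (int i = 0; i < 3; i++) System.out.println("drawing video frame " + i);
                });
                audio.start();          // both threads now run concurrently
                video.start();
                audio.join();           // wait for both to finish
                video.join();
            }
        }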

        To coordinate the actions of individual threads, Java also provides a thread synchronization mechanism, implemented internally with monitors. It allows different threads that access shared resources to cooperate with one another, ensuring data consistency and avoiding errors (see the example below).
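
        A minimal example of this mechanism (the class name SyncDemo is invented) uses the synchronized keyword, which acquires the object's monitor so that only one thread at a time can update the shared counter:

        public class SyncDemo {
            private int count = 0;

            public synchronized void increment() {  // enter the object's monitor
                count++;
            }

            public static void main(String[] args) throws InterruptedException {
                SyncDemo demo = new SyncDemo();
                Runnable task = () -> {
                    for (int i = 0; i < 10_000; i++) demo.increment();
                };
                Thread t1 = new Thread(task);
                Thread t2 = new Thread(task);
                t1.start();
                t2.start();
                t1.join();
                t2.join();
                System.out.println(demo.count);     // always 20000 thanks to synchronization
            }
        }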

        (7) Interpreted execution. Java programs are compiled into bytecode, which is then interpreted and executed by the virtual machine. This is why a Java program can run independently of the platform. It also makes programs well suited to incremental linking, which speeds up the development process.

Saturday 8 October 2016

US scientists complete the most complex direct brain-to-brain experiment yet

The roughly 100 billion neurons in the brain communicate with one another through some 100 trillion synapses, a network denser than the stars of the Milky Way. We can explore galaxies light-years away, yet we still know relatively little about our own brains.
In 2013, the European Union and the United States launched the "Human Brain Project" and the "BRAIN Initiative" respectively, planning to invest 1 billion euros and 4.5 billion US dollars over ten years in brain science, "the final scientific fortress and the ultimate frontier".
In September, researchers at the University of Washington used a brain-to-brain interface that allowed five pairs of subjects to play a question-and-answer game by passing brain signals over the Internet. The experiment proved for the first time that two brains can be directly connected: without a word being spoken, one person can accurately guess what the other is thinking.
It is one of the most complex brain-to-brain experiments humans have ever performed. The respondent wears an electroencephalograph (EEG) cap that records brain activity. When he sees an object on a computer screen, such as a dog, the interrogator at the other end sees a set of possible objects together with related questions. The interrogator clicks the mouse to send a question, and the respondent answers "yes" or "no" by staring at one of two LEDs flashing at different frequencies on the monitor. Either answer sends a signal to the interrogator over the Internet, activating a magnetic coil placed behind the interrogator's head, but only a "yes" generates a signal strong enough to stimulate the interrogator's visual cortex and produce a wavy or fine linear flash of light, a phenomenon known as a "phosphene". Through this feedback, the interrogator can confirm and identify the correct object.
The connection is non-invasive; compared with invasive means (such as electrodes implanted in the cortex), the signal is relatively easy to receive, but its transmission is a circuitous and complex process. Even so, direct brain-to-brain connection is one of the foundations on which deep emotions, knowledge and memories might one day be "exchanged", so even though all it achieves for now is a rudimentary "game", it still inspires boundless hope.
Thus, even though all that appears in the subject's brain is a fleeting "yes" flash, it is no less significant than the upper- and lower-case letters Mendel used in his famous pea hybridisation experiments of 1866 to represent different traits, letters that later came to stand for genes.

Friday 7 October 2016

Humanity's first close observation of Pluto

Of all the members of the solar system, Pluto has had perhaps the most ups and downs. Ever since Clyde Tombaugh discovered it in 1930, Pluto has been surrounded by controversy; even though textbooks around the world long listed it as the "ninth planet", it was finally voted out in 2006 and demoted to a "dwarf planet".
NASA launched New Horizons in 2006, aimed straight at Pluto. Nine years later, on 14 July 2015 (US Eastern time), New Horizons flew past Pluto at close range, becoming the first human probe to visit the distant dwarf planet.
"New Horizons" shaped like a grand piano, carrying a 77 kg propellant, including 10.9 kg of plutonium dioxide as a power source. It is by far the fastest detector, in less than 10 years, flying 4.8 billion kilometers. Its counterparts, as part of Pluto discoverer Clyde Thompson's ashes, and NASA solicitation of 450,000 signatures. On the road, it also visited the Saturn and Neptune.
Humanity previously knew very little about Pluto, and this "encounter" has brought a revolutionary change in our understanding of it: its diameter is 2,370.6 km (± 19.3 km), its density is lower than previously thought, and its interior may contain more ice and less rock. Scientists also confirmed that its north polar cap is, as had been speculated, composed of ice and is rich in methane and nitrogen ices.
At present, the "new horizon" probe is still on the line, will enter the mysterious edge of the solar system Kuiper belt, Kuiper belt object is considered to be formed during the solar system has not yet time to grow into a planet's wreckage, recording the solar system initially formed history , So the future "Kuiper Belt" trip, in a sense, is also a glimpse of the origin of the solar system.
"New Horizon" to visit Pluto, in the global media, astronomy, as well as the majority of science and technology enthusiasts of the carnival. There is no doubt that the first human close observation of Pluto, marking the golden age of the planet to explore the arrival of the summit.

Friday 28 September 2012

New processor designs boost graphics to speed up Windows

New processors from AMD and Intel will provide the horsepower for next-generation desktop applications that could previously only run on specialist workstations.

Integrated chips used to be the poor cousins of dedicated systems. PC manufacturers offered low-cost PC hardware – with integrated sound and video chips – that was cheaper than machines equipped with dedicated audio and video hardware.


But these integrated devices balanced cost with performance, and it was the performance that often suffered as a result of design compromises.

The industry has moved on and the integrated chip design, now called System on a Chip (SoC), is set to provide processing, graphics and multimedia in the next generation of PC and hybrid PC/tablet devices.

Intel's approach is called Intel HD Graphics, which offers built-in graphics, although its performance is inferior to dedicated graphics processors (GPUs), according to industry benchmarks.

The company did attempt to develop its own GPU, code-named Larrabee, but this project has been dropped. However, the latest Atom SoC design, the Z2760, which will power the new Dell, Fujitsu, HP, Asus and Acer Windows 8 Pro tablets, uses the Imagination PowerVR graphics core to improve graphics.

In its market trends report for electronic equipment published in July 2012, analyst Gartner noted that graphics processing units (GPUs), digital signal processors (DSPs) and other specialised cores will take centre stage in future SoC designs. 

“The importance of multimedia content to a broad variety of electronic equipment makes the ability to manage the presentation of the content critical. For most processor architectures, this is now handled by a graphics processing core that manages the resolution and the quality of the images rendered.” 

Integrating the GPU onto a SoC design will enhance the performance in future application processor units, according to Gartner.

This is exactly what AMD has been developing since it acquired graphics card maker ATI in 2006. 

“We put a GPU right beside a CPU core," says Adam Kozak, AMD client desktop product marketing manager. "We are implementing AES encryption (256-bit), up to four processor cores, and HD 7000 graphics, all on a single chip.” 

According to Kozak, graphics processing is the chipmaker's strongest area. He says the design philosophy of the company is to concentrate on developing high-performance chips at a low cost.

The latest so-called APU chip provides 4.2GHz on the CPU and 800MHz on the GPU, which, according to AMD's data, is capable of delivering 736 GFlops.

But does a PC need all this processing power, just to run Windows 8?  

Kozak believes so. Microsoft is using its DirectX graphics interfaces to speed up rendering of the Windows 8 user interface, and Office 12 also makes use of graphics acceleration. He says that in Windows 8, the AMD processor can drive three monitors from a single chip, without the need for additional graphics cards.

There is growing interest in using the powerful GPU in a PC to run supercomputer-like applications. In fact, graphics card maker Nvidia has developed Cuda (Compute Unified Device Architecture), an architecture for running computationally intensive applications on the multiple cores in its high-performance graphics card family.

But Cuda is proprietary to Nvidia. Kozak says the new Microsoft DirectCompute programming platform will enable application developers to target the CPU and the GPU in a standard way, not just on Nvidia GPUs. This will mean applications can take advantage of the raw processing power available on the high-performance GPU core that now resides in AMD's SoC designs.

As an alternative to Microsoft DirectCompute, applications can also use the OpenCL programming interfaces, which effectively do the same thing. Image-processing applications such as Adobe Photoshop use OpenCL to boost the performance of computationally intensive graphics rendering tasks. Kozak says OpenCL can also be used in more mainstream applications, such as WinZip, to speed up the compression and decompression of zip files.

Google opens €75m energy-efficient datacentre facility in Dublin

Google’s new €75m Dublin datacentre facility, which houses computers that provide critical cloud-based services such as the Google search engine, Gmail and Google Maps, is now operational.
The datacentre, at Profile Park in Clondalkin, Dublin – which took almost a year to build – uses an advanced air-cooling system and takes advantage of Ireland’s naturally cool climate. According to Google, this enables it to reduce power requirements significantly.  
Using natural or free-air cooling means the facility does not require costly and power-hungry air-conditioning units, which are still used in many traditional datacentres.
“As a company committed to carbon neutrality, we make sure that our datacentres are extremely efficient in their use of electricity," said Dan Costello, Google’s global datacentre operations director. "We use around 50% less energy than a typical datacentre. The new Dublin datacentre, with its highly efficient air-cooling system, continues this trend.”
Now that it is operational, the datacentre will provide employment opportunities in a range of roles including computer technicians, electrical and mechanical engineers, catering and security staff.
The IT facility was officially opened by Ireland’s Minister for Jobs, Enterprise and Innovation, Richard Bruton.
“Cloud computing forms a key part of the government’s action plan for jobs. Our technological infrastructure is improving and cloud computing is one area where our climate gives us advantages,” said Bruton.
In total, the design and construction of the Dublin datacentre took approximately 400,000 man-hours and employed over 1,000 professionals, according to Google.
“Demand for our services has grown rapidly in the past few years and our footprint in Ireland has expanded too - we now employ over 2,500 people here in Dublin, up from around 2,000 a year ago. Our new datacentre is a long-term investment and further strengthens our ties with Ireland,” said John Herlihy, head of Google in Ireland.

Friday 21 September 2012

Fujitsu blacklisting part of tighter government policy


The labelling of two IT suppliers as high-risk by government is part of a tightening-up of outsourcing, but how far can the government practically address the problem of failing outsourcing contracts?

Fujitsu is among the companies labelled as high-risk by government, a designation intended to alert all departments when a supplier has performed poorly.


Fujitsu has a number of contracts with government and a public sector IT services operation that accounts for over half of its UK business.

A Cabinet Office spokesman said the department cannot comment on the status of individual suppliers, but stressed that the government will not tolerate poor supplier performance.

He added that the government is improving its post-contract management capabilities and sharing information on supplier performance across government departments.

“We want to strengthen our contract management by reporting on suppliers’ performance against criteria and sharing the information across government. This means information on a supplier’s performance will be available and taken into consideration at the start of and during the procurement process (pre-contract),” he said. 

“Suppliers with poor performance may therefore find it more difficult to secure new work with HMG.”

He said the announcement that Fujitsu is now classified as high-risk is part of this strategy. 

“This policy will include the identification of any high-risk suppliers so that performance issues are properly taken into account before any new contracts are given.

“High-risk classification is based on material performance concerns. Suppliers deemed high-risk will be subject to particularly close scrutiny when awarding new work.”

The government said this is simply good business practice, with the government emulating the private sector. But it remains to be seen how much difference it will make in a sector where so many IT services contracts are dominated by so few suppliers.

If contracts are already in place, the blacklisting will have little effect on extensions or changes in scope, said one source.

He told Computer Weekly that the blacklisting of Fujitsu has not stopped it from winning bids. 

“Fujitsu are not as down as a result of this as you would expect. Fujitsu is winning government contracts through extensions and relationships they have with other suppliers.” 

Fujitsu is part of the Aspire contract as a subcontractor to HMRC, and of the Atlas consortium of suppliers to the MoD.

The Cabinet Office spokesman said he believed the government is getting tougher through the Cabinet Office. “The Cabinet Office is being much more the deciding factor than ever before and this is the most centralised control of suppliers and contracts I have seen in my time.”