Tag Archives: Cloud computing

NTT Launches Fifth Data Center In Malaysia – Cyberjaya 5!

NTT Ltd just launched their fifth data center in Malaysia – Cyberjaya 5 (CBJ5)!

Here is a quick look at what NTT Cyberjaya 5 offers!

 

NTT Launches Fifth Data Center In Malaysia – Cyberjaya 5

On 3 February 2021, NTT Ltd announced the launch of their fifth data center in Malaysia – Cyberjaya 5 (CBJ5).

Located within the NTT Cyberjaya Campus, this new 107,000 square feet data center is designed for hyperscalers and high-end enterprises in Malaysia’s growing digital economy.

CBJ5 supports 6.5 megawatts of flexible and scalable power, and boasts a Tier IV-ready, compact and modular design, with a cooling wall system that handles up to 15 kilowatts per rack.

NTT clients will have greater access to flexible, scalable and secure infrastructure in Malaysia – a regional data center hub.

“The demand for data storage and managed hosting services is expected to grow exponentially across Malaysia. This fifth data center will meet the expanding needs of organizations to reach their digital business objectives, in particular the FSI sector, as our data center complies with the Risk Management in Technology (RMiT) guideline set by Bank Negara Malaysia. We hope to play a key role in providing the vital data capacity at a high speed to keep Malaysia’s digital ecosystems and the digital economy ticking,” said Henrick Choo, CEO of NTT Ltd. in Malaysia.

 

NTT Cyberjaya 5 : Part Of Strategic ASEAN Hub

CBJ5 is connected to the existing Asia Submarine-cable Express (ASE) and Asia Pacific Gateway (APG) cable systems, and will eventually be linked to the upcoming MIST cable system.

The MIST cable system, a strategic joint venture with Orient Link Pte. Ltd. for international submarine cables in South East Asia, will be available by end-2022. It will enable NTT Ltd. to expand its offerings into India and beyond, while the ASE and APG cable systems provide global connectivity from Asia to the United States.

This new expansion in Malaysia is part of the NTT Global Data Centers division’s growth strategy. Malaysia is a prime data center market in the ASEAN region, due to its abundant resources and favourable government policies.

“NTT places Asia Pacific as a tactical key region, and Malaysia – a strategic hub for the submarine cables operated by NTT such as the new MIST cable system, as well as the existing Asia Submarine-cable Express (ASE) and Asia Pacific Gateway (APG). Furthermore, CBJ5 will drive business opportunities in Asia through the upcoming MIST cable system which will link all our large-scale data centers in the region. Our continued commitment to Malaysia will help position NTT as a technologically innovative leader to address the industries of the future,” said Ryuichi Matsuo, Executive Vice President for NTT Ltd.’s Global Data Centers division.

“The pandemic also illustrated the importance of effective connectivity and reliable infrastructure to ensure business continuity. NTT’s global data center platform offers flexible, scalable and secure infrastructure along with a full-stack of customizable solutions that clients can utilize to support their digital transformation needs and maintain critical applications in a comprehensive, hybrid IT environment,” he concluded.

 


 

Support Tech ARP!

If you like our work, you can help support us by visiting our sponsors, participating in the Tech ARP Forums, or even donating to our fund. Any help you can render is greatly appreciated!

Hitachi Vantara : 2020 HCI Portfolio Updates Revealed!

Hitachi Vantara just unveiled their 2020 HCI (Hyperconverged Infrastructure) portfolio, with updates to the Hitachi Unified Compute Platform (UCP). Here are the details…

 

Hitachi Vantara : 2020 HCI Portfolio Updates

Hitachi Vantara today unveiled their 2020 HCI (Hyperconverged Infrastructure) portfolio, featuring updates to Hitachi UCP HC and Hitachi UCP RS.

  • Faster provisioning with new Hitachi UCP Advisor
  • Certified support for SAP HANA workloads
  • New Intel Cascade Lake Refresh Xeon processors for increased performance
  • Enhanced lifecycle management capabilities for non-disruptive upgrades

The updated 2020 Hitachi Vantara HCI solutions unify cloud infrastructure management with interoperability across their customers’ environments, whether they are using traditional storage, HCI-powered hybrid clouds or public clouds.

These new HCI offerings include a scalable and simplified foundation for hybrid clouds, allowing customers to rapidly scale out when increased data center resources are required.

Unified Cloud Management

Customers have the flexibility to build a cloud infrastructure with seamless workload and data mobility across on-premises and public cloud environments.

Hitachi UCP Advisor accelerates provisioning up to 80% faster compared to previous HCI management tools and reduces management complexity across the environment.

Scalable Performance

Greater performance, scale and density support IT departments’ ability to rapidly scale data center resources for business-critical applications and lower operational overhead for better TCO.

The updated HCI platforms provide certified support for SAP HANA workloads on HCI. Intel Cascade Lake Refresh Xeon processors increase performance for workload consolidation while avoiding resource contention issues.

Hitachi HCI solutions help reduce CapEx and OpEx overhead with advanced automation and data efficiency technologies.

Simplified Consumption

Everflex from Hitachi Vantara provides simple, elastic and comprehensive acquisition choices for the entire Hitachi Vantara portfolio, including the UCP family. Its consumption-based pricing models align IT spend with business use, and help lower costs by up to 20% with pay-as-you-go pricing.

Customers can also accelerate time to production with pre-validated and optimized bundles and starter packs, including solutions enabling remote work.

 

2020 Hitachi Vantara HCI Portfolio : Availability

The 2020 Hitachi Unified Compute Platform HC and Hitachi Unified Compute Platform RS are available from Hitachi Vantara and their global network of business partners, with immediate effect.

 



Google Cloud Confidential VM With 2nd Gen AMD EPYC!

Google recently introduced Confidential Computing, with Confidential VM as the first product, and it’s powered by 2nd Gen AMD EPYC!

Here’s an overview of Confidential Computing and Confidential VM, and how they leverage the 2nd Gen AMD EPYC processor!

 

Google Cloud Confidential Computing : What Is It?

Google Cloud encrypts customer data while it’s “at-rest” and “in-transit“. But that data must be decrypted before it can be processed.

Confidential Computing addresses that problem by encrypting data in-use – while it’s being processed. This ensures that data is kept encrypted while in memory and outside the CPU.

 

Google Cloud Confidential VM, Powered By 2nd Gen AMD EPYC

The first product that Google is unveiling under its Confidential Computing portfolio is Confidential VM, now in beta.

Confidential VM basically adds memory encryption to the existing suite of isolation and sandboxing techniques Google Cloud uses to keep their virtual machines secure and isolated.

This will help customers, especially those in regulated industries, to better protect sensitive data by further isolating their workloads in the cloud.

Google Cloud Confidential VM : Key Features

Powered By 2nd Gen AMD EPYC

Google Cloud Confidential VM runs on N2D series virtual machines powered by the 2nd Gen AMD EPYC processors.

It leverages the Secure Encrypted Virtualisation (SEV) feature in 2nd Gen AMD EPYC processors to keep VM memory encrypted with a dedicated per-VM instance key.

These keys are generated and managed by the AMD Secure Processor inside the EPYC processor during VM creation, and reside solely within the processor – making them inaccessible to Google, or to any other virtual machine running on the host.

Your data will stay encrypted while it’s being used, indexed, queried, or trained on. Encryption keys are generated in hardware, per virtual machine and are not exportable.

Confidential VM Performance

Google Cloud worked together with the AMD Cloud Solution team to minimise the performance impact of memory encryption on workloads.

They added support for new open-source drivers (NVMe and gVNIC) to handle storage traffic and network traffic with higher throughput than older protocols, thus ensuring that Confidential VMs perform almost as fast as non-confidential VMs.

Easy Transition

According to Google, transitioning to Confidential VM is easy – all Google Cloud Platform (GCP) workloads can readily run as a Confidential VM whenever you want to.

Available OS Images

In addition to the hardware-based inline memory encryption, Google built Confidential VM on top of Shielded VM, to harden your OS image and verify the integrity of your firmware, kernel binaries and drivers.

Google currently offers images of Ubuntu 18.04, Ubuntu 20.04, Container-Optimized OS (COS v81), and RHEL 8.2.

They are currently working with CentOS, Debian and other distributions to offer additional OS images for Confidential VM.

 



AMD EPYC : Four Supercomputers In Top 50, Ten In Top 500!

AMD is on a roll, announcing more supercomputing wins for their 2nd Gen EPYC processors, including four supercomputers in the top 50 list, and ten in the top 500!

 

2nd Gen AMD EPYC : A Quick Primer

The 2nd Gen AMD EPYC family of server processors are based on the AMD Zen 2 microarchitecture and fabricated on the latest 7 nm process technology.

According to AMD, they offer up to 90% better integer performance and up to 79% better floating-point performance than the competing Intel Xeon Platinum 8280 processor.

Here is a quick 7.5 minute summary of the 2nd Gen EPYC product presentations by Dr. Lisa Su, Mark Papermaster and Forrest Norrod!

 

AMD EPYC : Four Supercomputers In Top 50, Ten In Top 500!

Thanks to the greatly improved performance of their 2nd Gen EPYC processors, they now power four supercomputers in the top 50 list :

Top 50 Rank : Supercomputer – Processor

  • #7 : Selene – NVIDIA DGX A100 SuperPOD (AMD EPYC 7742)
  • #30 : Belenos – Atos BullSequana XH2000 (AMD EPYC 7H12)
  • #34 : Joliot-Curie – Atos BullSequana XH2000 (AMD EPYC 7H12)
  • #48 : Mahti – Atos BullSequana XH2000 (AMD EPYC 7H12)

On top of those four supercomputers, there are another six supercomputers in the Top 500 ranking powered by AMD EPYC.

In addition to powering supercomputers, AMD EPYC 7742 processors will soon power Gigabyte servers selected by CERN to handle data from their Large Hadron Collider (LHC).

 

3rd Gen AMD EPYC Supercomputers

AMD also announced that two universities will deploy Dell EMC PowerEdge servers powered by the upcoming 3rd Gen AMD EPYC processors.

Indiana University

Indiana University will deploy Jetstream 2 – an eight-petaflop distributed cloud computing system, powered by the upcoming 3rd Gen AMD EPYC processors.

Jetstream 2 will be used by researchers in a variety of fields like AI, social sciences and COVID-19 research.

Purdue University

Purdue University will deploy Anvil – a supercomputer powered by the upcoming 3rd Gen AMD EPYC processors, for use in a wide range of computational and data-intensive research.

AMD EPYC will also power Purdue University’s community cluster “Bell”, scheduled for deployment in the fall.

 



Cyber Crime WhatsApp Warning Hoax Debunked!

A Cyber Crime warning hoax is circulating on WhatsApp, claiming that people are being monitored by the government.

Well, not to worry – unless you are living in China – this is yet another Internet hoax. Here is why we know that…

 

The Cyber Crime WhatsApp Warning Hoax

This is the Cyber Crime warning hoax that has been circulating on WhatsApp :

From tomorrow onwards there are new communication regulations.

All calls are recorded

All phone call recordings saved

WhatsApp is monitored

Twitter is monitored

Facebook is monitored

All social media and forums are monitored

Inform those who do not know.

Your devices are connected to ministry systems.

Take care not to send unnecessary messages

Inform your children, Relatives and friends about this to take care

​​Don’t forward any posts or videos etc., you receive regarding politics/present situation about Government/PM etc.​​

Police have put out a notification termed ..Cyber Crime … and action will be taken…just don’t delete …

Inform your friends & others too.

Writing or forwarding any msg on any political & religious debate is an offence now….arrest without warrant…

This is very serious, plz let it be known to all our groups and individual members as group admin can b in deep trouble.

Take care not to send unnecessary messages.
Inform everyone about this to take care.

Please share it; it’s very much true. Groups please be careful.

Note that it’s generic enough that it can apply to almost any government in the world.

 

The Cyber Crime WhatsApp Warning Hoax Debunked!

And here is why this is nothing more than yet another Internet hoax :

Only China Is Capable Of Doing This

The only country that has accomplished most of what was shared above is China, but it took them decades to erect the Great Firewall of China.

It’s not just the massive infrastructure that needs to be created; it also requires legislation to be enacted, and considerable manpower and resources to maintain such a system.

That’s why China is leaning heavily on AI and cloud computing capabilities to automatically and quickly censor information deemed “sensitive”.

However, no other country has come close to spending the money and resources on a similar scale, although Cuba, Vietnam, Zimbabwe and Belarus have imported some surveillance technology from China.

WhatsApp, Instagram + Facebook Messenger Have End-to-End Encryption

All three Facebook-owned apps are now running on the same common platform, which provides end-to-end encryption.

End-to-end encryption protects messages as they travel through the Internet, and specifically prevents anyone (bad guys or your friendly government censor) from snooping into your conversation.

That is also why all three are banned in China…
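To illustrate why end-to-end encryption defeats this kind of blanket monitoring, here is a toy sketch in Python. This is NOT the actual Signal protocol that WhatsApp uses (which relies on X3DH key agreement and the Double Ratchet); it is a minimal stand-in that derives a keystream from a shared key using only the standard library, just to show that an intermediary relaying the traffic sees only ciphertext:

```python
import hashlib
import hmac

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Derive a pseudo-random keystream from the shared key (toy construction)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hmac.new(key, nonce + counter.to_bytes(4, "big"), hashlib.sha256).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, nonce: bytes, plaintext: bytes) -> bytes:
    # XOR with the keystream; decryption is the exact same operation
    return bytes(p ^ k for p, k in zip(plaintext, keystream(key, nonce, len(plaintext))))

decrypt = encrypt

# Only the two endpoints hold the shared key; real apps establish it via a key exchange.
shared_key = b"established-between-the-two-phones"  # hypothetical value
nonce = b"msg-0001"

message = b"See you at 7pm"
ciphertext = encrypt(shared_key, nonce, message)

# The relay server (or a government censor) sees only the ciphertext...
assert ciphertext != message

# ...while the recipient, holding the same key, recovers the plaintext.
assert decrypt(shared_key, nonce, ciphertext) == message
```

Anyone sitting between the two phones can log the ciphertext, but without the key held only at the endpoints, the content of the conversation remains unreadable.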

The Police Cannot Enact Laws

There are cybercrime laws in most, if not every, country in the world. But they are all enacted by legislative bodies of some sort, not the police.

The police are the executive arm of a country, empowered to enforce the law. They do not have the power to create a law, and then act on it.

Even The Government Has Debunked It!

Just in case you are still not convinced, even the Malaysian government issued a fact check on this hoax, debunking it as fake news :

Basically, it states “The Ministry of Home Affairs has NEVER recorded telephone calls or monitored social media in this country“.

 


NTT To Build 5th Data Centre In Cyberjaya, Malaysia!

NTT Limited just announced that they will construct their 5th data centre at the NTT Cyberjaya Campus in Malaysia!

Here are the details…

 

NTT To Build 5th Data Centre In Cyberjaya, Malaysia

As part of their expansion plans, NTT will build their fifth data centre at their Cyberjaya campus in Malaysia. This Tier-4 ready, compact and modular data centre, called CBJ5, is scheduled to be complete in 2020.

Once it comes online, NTT says that it will provide their clients with “a flexible and scalable power and cooling solution coupled with an industry leading Service Level Agreement (SLA) that is the first of its kind in Malaysia”.

This announcement comes after Malaysia’s Budget 2020 was tabled, with a focus on driving economic growth through digital transformation. There will be additional grants and incentives for organisations to digitally transform their businesses.

As such, NTT believes there will be increased demand for their services, which CBJ5 will be ready to fulfil.

“CBJ5 is designed to meet the requirements of hyperscalers and high-end enterprises, especially those that require solid power management capabilities. CBJ5 is able to accommodate progressive power increments and cooling of up to 10kW/rack. This is revolutionary as it will allow our clients to maximize the power resources in their chosen data center,” said Henrick Choo, CEO, Malaysia for NTT Ltd.

 

NTT Data Centre Capabilities

With CBJ5, NTT aims to become the leading Digital Infrastructure Provider in Malaysia, attracting both domestic and global traffic into its carrier-neutral data centre campus that also includes NTT’s global Tier-1 IP network, Multi-Cloud Connect platform and domestic Internet Exchange.

The NTT Cyberjaya Campus features a high-density fibre network facilitating inter-connection among its clients to create a digital supply chain ecosystem.

Bank Negara Malaysia recently announced a set of guidelines defining Risk Management in Technology (RMiT) for financial institutions. Security thus becomes a crucial consideration, as these institutions are now responsible for the safety of their information infrastructure, systems and data.

NTT will address these new requirements by working together with financial institutions, so they are able to comply with BNM’s guideline. Henrick also stated that security will be heightened as the company grows.

“NTT’s physical data center access control will be increased to safeguard all data center blocks within the NTT Cyberjaya Campus. We will also be introducing smart security technology integrating Visitor Management Systems with facial recognition technology. Essentially, we will double our security cover with the two combined,” he added.

NTT clients are able to choose from multiple architectures – on-premises, cloud and even multi-cloud. All solutions will come with managed security solutions to offer data protection while minimising business disruptions.

 



DiDi Adopts NVIDIA AI + GPUs For Self-Driving Cars!

At GTC China 2019, DiDi announced that they will adopt NVIDIA GPUs and AI technologies to develop self-driving cars, as well as their cloud computing solutions.

 

DiDi Adopts NVIDIA AI + GPUs For Self-Driving Cars!

This announcement comes after DiDi spun off their autonomous driving unit as an independent company in August 2019.

In their announcement, DiDi confirmed that they will use NVIDIA technologies in both their data centres and onboard their self-driving cars :

  • NVIDIA GPUs will be used to train machine learning algorithms in the data center
  • NVIDIA DRIVE will be used for inference in their Level 4 self-driving cars

NVIDIA DRIVE will fuse data from all types of sensors – cameras, LIDAR, radar, etc – and use numerous deep neural networks (DNNs) to understand the surrounding area, so the self-driving car can plan a safe way forward.

Those DNNs must first be trained in the data centre, using NVIDIA GPU servers and machine learning algorithms.

Recommended : NVIDIA DRIVE AGX Orin for Autonomous Vehicles Revealed!

 

DiDi Cloud Computing Will Use NVIDIA Tech Too

DiDi also announced that DiDi Cloud will adopt and launch new vGPU (virtual GPU) cloud servers based on NVIDIA GPUs.

The new vGPU licensing model will offer more affordable and flexible GPU cloud computing services for remote computing, rendering and gaming.

 



Intel oneAPI Unified Programming Model Overview!

At Supercomputing 2019, Intel unveiled their oneAPI initiative for heterogeneous computing, promising to deliver a unified programming experience for developers.

Here is an overview of the Intel oneAPI unified programming model, and what it means for programmers!

 

The Need For Intel oneAPI

The modern computing environment is now a lot less CPU-centric, with the greater adoption of GPUs, FPGAs and custom-built accelerators (like the Alibaba Hanguang 800).

Their different scalar, vector, matrix and spatial architectures require different APIs and code bases, which complicates attempts to utilise a mix of those capabilities.

 

Intel oneAPI For Heterogeneous Computing

Intel oneAPI promises to change all that, offering a unified programming model for those different architectures.

It allows developers to create workloads and applications for multiple architectures on their platform of choice, without the need to develop and maintain separate code bases, tools and workflow.

Intel oneAPI comprises two components – the open industry initiative, and the Intel oneAPI beta toolkit :

oneAPI Initiative

This is a cross-architecture development model based on industry standards, and an open specification, to encourage broader adoption.

Intel oneAPI Beta Toolkit

This beta toolkit offers the Intel oneAPI specification components with direct programming (Data Parallel C++), API-based programming with performance libraries, advanced analysis and debug tools.

Developers can test code and workloads in the Intel DevCloud for oneAPI on multiple Intel architectures.

 

What Processors + Accelerators Are Supported By Intel oneAPI?

The beta Intel oneAPI reference implementation currently supports these Intel platforms :

  • Intel Xeon Scalable processors
  • Intel Core and Atom processors
  • Intel processor graphics (as a proxy for future Intel discrete data centre GPUs)
  • Intel FPGAs (Intel Arria, Stratix)

The oneAPI specification is designed to support a broad range of CPUs and accelerators from multiple vendors. However, it is up to those vendors to create their own oneAPI implementations and optimise them for their own hardware.

 

Are oneAPI Elements Open-Sourced?

Many oneAPI libraries and components are already open-sourced, or soon will be.

 

What Companies Are Participating In The oneAPI Initiative?

According to Intel, more than 30 vendors and research organisations support the oneAPI initiative, including CERN openlab, SAP and the University of Cambridge.

Companies that create their own implementation of oneAPI and complete a self-certification process will be allowed to use the oneAPI initiative brand and logo.

 

Available Intel oneAPI Toolkits

At the time of its launch (17 November 2019), here are the toolkits that Intel has made available for developers to download and use :

Intel oneAPI Base Toolkit (Beta)

This foundational kit enables developers of all types to build, test, and deploy performance-driven, data-centric applications across CPUs, GPUs, and FPGAs. Comes with :

  • Intel oneAPI Data Parallel C++ Compiler
  • Intel Distribution for Python
  • Multiple optimized libraries
  • Advanced analysis and debugging tools

Domain Specific oneAPI Toolkits for Specialised Workloads :

  • oneAPI HPC Toolkit (beta) : Deliver fast C++, Fortran, OpenMP, and MPI applications that scale.
  • oneAPI DL Framework Developer Toolkit (beta) : Build deep learning frameworks or customize existing ones.
  • oneAPI IoT Toolkit (beta) : Build high-performing, efficient, reliable solutions that run at the network’s edge.
  • oneAPI Rendering Toolkit (beta) : Create high-performance, high-fidelity visualization applications.

Additional Toolkits, Powered by oneAPI

  • Intel AI Analytics Toolkit (beta) : Speed AI development with tools for DL training, inference, and data analytics.
  • Intel Distribution of OpenVINO Toolkit : Deploy high-performance inference applications from device to cloud.
  • Intel System Bring-Up Toolkit (beta) : Debug and tune systems for power and performance.

You can download all of those toolkits from the Intel website.

 



Samsung – IBM AI IoT Cloud Platform For 5G Mobile Solutions!

At the Samsung Developer Conference 2019, Samsung and IBM announced a joint platform that leverages Samsung Galaxy devices and IBM cloud technologies to introduce new 5G, AI-powered mobile solutions!

Here is what you need to know about this new Samsung-IBM AI IoT cloud platform, and the 5G AI-powered mobile solutions it’s powering for governments and enterprises.

 

Samsung – IBM AI IoT Cloud Platform For 5G Mobile Solutions!

Built using IBM Cloud technologies and Samsung Galaxy mobile devices, the new platform will help improve the work environment for employees in high-stress or high-risk occupations.

This will help reduce the risks to these public employees who work in dangerous and high-stress situations. This is critical because nearly 3 million deaths occur each year due to occupational accidents.

This new, unnamed Samsung-IBM platform will help governments and enterprises track their employees’ vitals, including heart rate and physical activity. This will allow them to determine if an employee is in distress and requires help.

 

The Samsung – IBM AI IoT Cloud Platform In Use

5G mobile solutions based on the new Samsung-IBM AI IoT platform are being piloted by multiple police forces, to monitor the health of their officers in real time and provide situational awareness insights to first responders and their managers.

The platform can track, in real time, the safety and wellness indicators of first responders equipped with Samsung Galaxy Watches and Galaxy smartphones with 5G connectivity.

It can instantly alert emergency managers if there is a significant change in the safety parameters, which may indicate the first responder is in danger of a heart attack, heat exhaustion or other life-threatening events.

This allows them to anticipate potential dangers, and quickly send assistance. This should greatly reduce the risk of death and injuries to their employees.
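The alerting logic described above amounts to checking incoming vitals readings against safety thresholds. Here is a minimal sketch in Python of how such a check might work. The field names and threshold values are hypothetical – the actual rules and APIs of the Samsung-IBM platform have not been disclosed:

```python
from dataclasses import dataclass

@dataclass
class VitalsReading:
    officer_id: str
    heart_rate: int      # beats per minute, e.g. from a Galaxy Watch (hypothetical field)
    body_temp_c: float   # degrees Celsius (hypothetical field)

# Hypothetical safety thresholds; a real deployment would tune these per person.
MAX_HEART_RATE = 180
MAX_BODY_TEMP_C = 39.0

def check_vitals(reading: VitalsReading) -> list:
    """Return alert messages for an emergency manager if any threshold is crossed."""
    alerts = []
    if reading.heart_rate > MAX_HEART_RATE:
        alerts.append(f"{reading.officer_id}: heart rate {reading.heart_rate} bpm - possible distress")
    if reading.body_temp_c > MAX_BODY_TEMP_C:
        alerts.append(f"{reading.officer_id}: body temp {reading.body_temp_c}C - possible heat exhaustion")
    return alerts

# A normal reading raises no alerts...
assert check_vitals(VitalsReading("officer-17", heart_rate=92, body_temp_c=36.8)) == []

# ...while an abnormal one flags both indicators for immediate follow-up.
assert len(check_vitals(VitalsReading("officer-17", heart_rate=191, body_temp_c=39.4))) == 2
```

In a real system, readings would stream continuously from the watches over 5G, and the alerts would be pushed to an emergency manager's dashboard rather than returned from a function call.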

 



3rd Gen X-Dragon Architecture by Alibaba Cloud Explained!

At the Apsara Conference 2019, Alibaba Cloud announced that they will be introducing the 3rd Gen X-Dragon Architecture for their cloud servers!

Here is a quick PRIMER on the new 3rd Gen X-Dragon Architecture!

 

What Is X-Dragon?

X-Dragon – Shenlong in Chinese – is a proprietary bare metal server architecture developed by Alibaba Cloud for their cloud computing requirements.

Built around a custom X-Dragon MOC card, it delivers what Alibaba Cloud calls Elastic Compute Service (ECS) capability in a bare metal server.

The ECS bare metal instances it offers combine the benefits of bare metal servers, and virtual machines.

For example, it offers direct access to CPU and RAM resources without virtualisation overheads that bare metal servers offer, with the instant deployment and image migration capabilities of virtual machines.

The downsides? ECS bare metal instances, once deployed, cannot be upgraded or downgraded. In addition, if there is a hardware failure, a failover occurs and the data remains stored in the instance’s storage drives.

 

What’s New In The 3rd Gen X-Dragon Architecture?

Basically – SPEED.


According to Alibaba Cloud, the 3rd Gen X-Dragon architecture is able to increase Queries Per Second (QPS) by 30% and decrease latency by 60% in e-commerce scenarios.

In tandem, they also announced the 6th Gen ECS instance, which delivers a 20% boost in computing power, a 30% reduction in memory latency, and a 70% reduction in storage IO latency.

Not new, but also important is the fact that because it is cloud-native by design, it eliminates power wastage from idle bare metal servers. Alibaba Cloud claims that alone reduces the unit computing cost by 50%.

 

3rd Gen X-Dragon Architecture Availability

Alibaba Cloud will start rolling out the 3rd Gen X-Dragon architecture upgrade to millions of their cloud servers around the world from 2020 onwards.

 



Dimension Data Expert Panels On Cyberattack Mitigation + Cloud Security

Dimension Data organised two expert panels on cyberattacks and cloud security, as part of their coverage of the 2019 NTT Security Global Threat Intelligence Report.

Find out what cybersecurity experts from Dimension Data, Cisco and more think about cloud security, cyberattacks and mitigating them.

 

Dimension Data Expert Panels On Cyberattack Mitigation + Cloud Security

Freda Liu hosted the two expert panels, with panellists from Cisco, Recorded Future, F5 and Cybersecurity Malaysia, as well as Mark Thomas, Dimension Data’s VP of Cybersecurity.

The two expert panels addressed the chief concerns of their clients, namely on cloud security, and the mitigation of cyberattacks.

 

Dimension Data Panel #1 : Top Cyberattacks + Mitigation Tips

Enterprises are continuously experiencing cyberattacks in today’s digital world. Challenges like compliance management, coin mining, web-based attacks, and credential theft have been observed over the past year.

In this session, the Dimension Data panel of experts provided insights into the top cyberattacks and the shifting threat landscape. They also discussed best practices and practical measures you can take to bolster your cybersecurity defences.

 

Dimension Data Panel #2 : Security In The Cloud

Today, cybersecurity leaders’ jobs are made more difficult as the number of areas and ‘things’ that need to be secured is constantly increasing.

Your infrastructure is no longer just physical, it’s cloud, and hybrid too.

What are the people, process and tools you need in place to help improve your organisation’s resilience and embark on the journey to world-class cybersecurity?

 



Pat Gelsinger Reveals 2019 VMware Strategy + Plans!

On the second day of Dell Technologies World 2019, VMware CEO Pat Gelsinger shared his 2019 vision for VMware. Here is a sneak peek at the 2019 VMware strategy and plans!

 

Pat Gelsinger Reveals 2019 VMware Strategy + Plans!

VMware was a major force at Dell Technologies World 2019, demonstrating VMware’s great importance in the Dell Technologies family.

They announced amongst other things, Dell Unified Workspace which is based on VMware Workspace ONE, Dell Technologies Cloud in collaboration with Dell EMC, VMware Cloud on Dell EMC, and Azure VMware Solutions in collaboration with Microsoft.

But that was not all, as Pat Gelsinger would soon reveal…

 

Pat Gelsinger Reveals 2019 VMware Strategy + Plans!

VMware CEO Pat Gelsinger comes from a “hardware” background, having served as Intel’s first Chief Technology Officer before taking over as President and CEO of EMC.

It is a perspective he drew upon in his talk about the Superpowers of Tech – Cloud, Mobile, AI/ML and Edge/IoT.

  • VMware’s vision still focuses on enabling any cloud resource and application, from the past or the present, to work on any device, while improving intrinsic security over time.
  • The hybrid cloud is the best answer for almost every single workload, because of three “laws” – the laws of physics, the laws of economics, and the laws of the land.
  • VMware and Dell Technologies are focusing on the hybrid cloud architecture with VxRail as its building block.
  • To bind together and manage disparate cloud and on-premise solutions with greater visibility, VMware is offering CloudHealth on all VMware Cloud solutions on Amazon, Azure and Google.
  • VMware is making great investments into Kubernetes as the “middleware for the cloud“.
  • VMware is partnering with Pivotal to make VMware PKS available as VMware Enterprise PKS, VMware Essential PKS and VMware Cloud PKS.
  • VMware is also rebuilding their security architecture and products, with AppDefense and vSphere Platinum, giving virtual machines an AI capability to learn the users’ behaviour, as well as end-to-end encryption throughout the network infrastructure.
  • The newly-announced Dell Unified Workspace leverages VMware’s Workspace ONE unified endpoint management to maintain the user’s devices in good health, while allowing them to seamlessly access any native, SaaS (Software as a Service), or internal application, with a single sign-on from any device.

 

Recommended Reading

Go Back To > Business + Enterprise | Home

Support Tech ARP!

If you like our work, you can help support our work by visiting our sponsors, participating in the Tech ARP Forums, or even donating to our fund. Any help you can render is greatly appreciated!


Here Are The 2019 Virtustream Multicloud Survey Results!

The 2019 Virtustream Multicloud Survey results were just announced, with some interesting insights into the multicloud strategies being employed globally. Here is our quick primer on what the study revealed!

 

The 2019 Virtustream Multicloud Survey

Virtustream, a Dell Technologies company, commissioned Forrester Consulting to conduct their latest global survey of the multicloud strategy of more than 700 companies globally.

Virtustream commissioned the survey to study the current state of enterprise IT strategies for cloud-based workloads.

 

The 2019 Virtustream Multicloud Survey Summarised

The survey report, titled Multicloud Drive Mission-Critical Benefits, found that :

  • almost all (97%) of those companies use multicloud strategies for their mission-critical applications
  • two thirds use multiple vendors for mission-critical workloads
  • multicloud deployments will increase over the next few years, with businesses expanding their multicloud budgets for staffing, investments and training
  • nearly 90% of companies surveyed will maintain or increase their budget to boost their multicloud deployments
  • nearly 75% of companies surveyed are using multiple cloud providers for mission-critical applications
  • nearly 61% of companies surveyed were concerned about the security and management of their multicloud strategies

 

Benefits Of Multicloud Strategies For Mission-Critical Applications

A majority of business organisations shared that multicloud strategies were used in mission-critical cases involving customers’ financial data or sales applications.

In fact, the survey found that nearly 75% of business organisations are using about 2-3 cloud providers for business-critical applications.

Among the main benefits of multicloud strategies were quick and efficient responses to business changes and challenges.

Increased performance and savings in operational costs were also cited as additional benefits.

 

Security + Management Concerns In The 2019 Virtustream Multicloud Survey

The management and deployment of multicloud strategies is complex, which is why many business organisations face issues with implementation.

Although nearly 61% of respondents agreed that multicloud strategies complement their business objectives, there were still concerns over security and management.

Thus, many companies are planning to add qualified and skilled staff to support their multicloud strategies, and to work with cloud vendors that have the proper expertise and experience.

Recommended Reading

Go Back To > Business + Enterprise | Home

 

Support Tech ARP!

If you like our work, you can help support our work by visiting our sponsors, participating in the Tech ARP Forums, or even donating to our fund. Any help you can render is greatly appreciated!


Microsoft Build 2019 : New Azure Technologies Unveiled!

A host of new Microsoft Azure technologies for developers have been announced at the Microsoft Build 2019 conference, which took place in Seattle. Here is a primer on what they announced!

 

Microsoft Build 2019 : New Azure Technologies Unveiled!

With nearly 6,000 developers and content creators attending Microsoft Build 2019 in Seattle, Microsoft announced a series of new Azure services, spanning hybrid cloud and edge computing, to support them. They include advanced technologies such as :

  • Artificial Intelligence (AI)
  • Mixed reality
  • IoT (Internet of Things)
  • Blockchain

 

Microsoft Build 2019 : New Azure AI Technologies

First of all, they unveiled a new set of Microsoft Azure AI technologies to help developers and data scientists utilize AI as a solution :

  • Azure Cognitive Services, which will enable applications to see, hear, respond, translate, reason and more.
  • Microsoft will add the “Decision” function to Cognitive Services to help users make decisions through highly specific and customized recommendations.
  • Azure Search will also be further enhanced with an AI feature.

 

Microsoft Build 2019 : New Microsoft Azure Machine Learning Innovations

Microsoft Azure Machine Learning has been enhanced with new machine learning innovations designed to simplify the building, training and deployment of machine learning models. They include :

  • MLOps capabilities with Azure DevOps
  • Automated ML advancements
  • Visual machine learning interface

Microsoft Build 2019 : New Edge Computing Solutions

Microsoft also aims to boost edge computing by introducing these new solutions:

  • Azure SQL Database Edge
  • IoT Plug and Play
  • HoloLens 2 Developer Bundle
  • Unreal Engine 4

Microsoft Build 2019 : Azure Blockchain Service

The Azure Blockchain Workbench, which Microsoft released last year to support development of blockchain applications, has been further enhanced this year with the Azure Blockchain Service.

Azure Blockchain Service is a tool that simplifies the formation and management of consortium blockchain networks so companies only need to focus on app development.

Microsoft introduced J.P. Morgan’s Ethereum platform as the first ledger available in the Azure Blockchain Service.

 

Recommended Reading

Go Back To > Business + Enterprise | Home

Support Tech ARP!

If you like our work, you can help support our work by visiting our sponsors, participating in the Tech ARP Forums, or even donating to our fund. Any help you can render is greatly appreciated!


The 2019 Microsoft – Dell Partnership : All You Need To Know!

One of the biggest stories from Dell Technologies World 2019 was the new Microsoft – Dell partnership. Microsoft CEO Satya Nadella himself joined Michael Dell to talk about it!

 

The 2019 Microsoft – Dell Partnership

Dell Technologies and Microsoft have been partners for a long time now, but at Dell Technologies World 2019, they announced that they were expanding their partnership in a number of digital transformation solutions.

Let’s take a look at the components of the 2019 Microsoft – Dell partnership…

 

Azure VMware Solutions – VMware On Azure!

For the first time ever, Microsoft will offer VMware on Azure! The new Azure VMware Solutions are built on VMware Cloud Foundation, and deployed in Azure.

This would allow companies to capitalise on VMware’s trusted cloud infrastructure and the mission-critical performance of Microsoft Azure.

Hybrid Cloud Connectivity

With Azure VMware Solutions, customers will be able to seamlessly migrate, extend and run existing VMware workloads from on-premises environments to Microsoft Azure without the need to re-architect applications or retool operations.

They will also be able to build, run, manage, and secure new and existing applications across VMware environments and Microsoft Azure, while extending a single model for operations based on established tools, skills and processes as part of a hybrid cloud strategy.

Tapping Into Azure Capabilities

Azure VMware Solutions enable organisations to tap into Azure’s scale, security and fast provisioning cycles to innovate and modernise applications while also improving performance.

By integrating with native Azure services, customers can easily infuse advanced capabilities like AI, Machine Learning, and IoT into their applications enabling new, intelligent experiences.

Metal-as-a-Service

Azure VMware Solutions are first-party services from Microsoft developed in collaboration with VMware Cloud Verified partners CloudSimple and Virtustream (a Dell Technologies company).

Both CloudSimple and Virtustream run the latest VMware software-defined data center technology.

This ensures that customers enjoy the same benefits of a consistent infrastructure and consistent operations in the cloud as they achieve in their own physical data center, while allowing customers to also access the capabilities of Microsoft Azure.

 

New Microsoft – Dell Workspace Solutions

The 2019 Microsoft – Dell partnership also sees a collaboration in digital workspace solutions between Microsoft and both Dell Technologies and VMware.

Microsoft 365 + VMware Workspace ONE

Customers who use both Microsoft 365 and VMware Workspace ONE will now be able to leverage both solutions to maximise their investments.

Specifically, they will be able to use Workspace ONE to manage and secure Office 365 across devices through cloud-based integration with Microsoft Intune and Azure Active Directory.

Dell Provisioning Services Integration

Through the new Dell Technologies Unified Workspace, customers can leverage the integration of Microsoft Windows Autopilot and Dell Device Provisioning and Deployment Services, like Dell ProDeploy – all enabled by the integration of Microsoft 365, Workspace ONE, and Dell Provisioning Services.

Windows Virtual Desktop

Microsoft also announced Windows Virtual Desktop, the only service that delivers a multi-session Windows 10 experience, optimisations for Office 365 ProPlus, and support for Windows Server Remote Desktop Services (RDS) desktops and apps.

As a part of this agreement, VMware will extend the capabilities of Microsoft Windows Virtual Desktop to enable customers to further accelerate their cloud initiatives, leveraging VMware Horizon Cloud on Microsoft Azure.

Initial capabilities are expected to be available as a tech preview by the end of calendar year 2019.

VMware + Azure Integration

Microsoft and VMware are also exploring initiatives to drive further integration between VMware infrastructure and Azure such as integration of VMware NSX with Azure Networking and integration of specific Azure services with VMware management solutions.

They will also be exploring bringing specific Azure services to the VMware on-premises customers. Through this collaboration, the companies aim to give customers a more seamless experience across VMware and Azure environments.

 

Recommended Reading

Go Back To > Enterprise + Business | Home

 

Support Tech ARP!

If you like our work, you can help support our work by visiting our sponsors, participating in the Tech ARP Forums, or even donating to our fund. Any help you can render is greatly appreciated!


2019 ASEANDSE | ASEAN Data Science Explorers Launched!

SAP and the ASEAN Foundation just announced that application for the 2019 ASEANDSE | ASEAN Data Science Explorers programme is now open! Here are the full details!

 

What Is ASEANDSE | ASEAN Data Science Explorers?

The ASEANDSE (ASEAN Data Science Explorers) programme is a joint collaboration between SAP and the ASEAN Foundation. It aims to promote and galvanise the use of data science amongst ASEAN tertiary students.

It aims to do this through two key activities – a series of enablement sessions, and a data analytics competition. Since its introduction in 2017, ASEANDSE has empowered over 5,000 youths from 287 higher education institutions in the ASEAN region.

 

The 2019 ASEANDSE Programme

The 2019 ASEANDSE programme will be carried out from February to October 2019. It starts with enablement sessions that are designed to improve the data analytics skills and knowledge of both students and lecturers at local institutions of higher learning across the ASEAN region.

These enablement sessions will be followed by a national, and then regional, data analytics competition. At these competitions, student teams of two will present their data-driven proposals using the SAP Analytics Cloud service.

Their ASEANDSE competition proposals must tackle issues affecting their country or ASEAN in general, in line with one of these UN Sustainable Development Goals :

  • Good health and well-being
  • Quality education
  • Gender equality
  • Decent work and economic growth
  • Industry, innovation and infrastructure
  • Sustainable cities and communities

One team from each ASEAN member state will be crowned as the national finalist before advancing to the 2019 ASEANDSE regional finals, which will be held in Bangkok on 16 October 2019.

There, the 10 national finalists will be given the opportunity to present their winning ideas to a panel of judges made up of distinguished representatives from the ASEAN Foundation and SAP, as well as various government officials and selected NGOs.

 

Where To Join The 2019 ASEANDSE Programme

The 2019 ASEANDSE programme is now open for registration, until 10 May 2019. Here are the eligibility requirements :

  1. Nationals of ASEAN member countries (ie. Brunei, Cambodia, Indonesia, Laos, Malaysia, Myanmar, Philippines, Singapore, Thailand and Vietnam)
  2. Full-time tertiary students currently pursuing their Diploma or Undergraduate studies in one of the tertiary institutions in Southeast Asia.
  3. Above the age of 16 as at the start of the Contest Period. Participants under the age of 18 must obtain parental consent. The consent form is available upon registration.

If you meet those requirements, you can register here!

 

Recommended Reading


Go Back To > Business + Enterprise | Home

 

Support Tech ARP!

If you like our work, you can help support our work by visiting our sponsors, participating in the Tech ARP Forums, or even donating to our fund. Any help you can render is greatly appreciated!

Microsoft Ignite 2019 @ Sydney – A Quick Tour!

Microsoft Ignite 2019 is here in Sydney! It was sold out, of course, but we managed to get a ticket.

So here is our quick tour for everyone else who could not make it, but wanted to see what it’s all about!

 

Microsoft Ignite 2019 @ Sydney

The Microsoft Ignite 2019 @ Sydney is a Microsoft tech conference for developers and tech professionals. For two days, it offers skill-building workshops, networking opportunities and access to top Microsoft engineers. No wonder tickets were quickly sold out!

For developers and tech professionals who want to learn how to better leverage Microsoft’s cloud services, the 2019 Microsoft Ignite is a rare opportunity to learn directly from those who build Power BI, Microsoft 365, and Azure.

  • Building your Applications for the Cloud – Learn how to architect and build your applications to take advantage of the scale that the cloud offers.
  • Deploying your Desktop & Infrastructure – Discover how to begin your journey to the cloud and what steps you need to both move your infrastructure and deploy your desktop.
  • Getting the most out of your Data – Learn how to use AI and Machine Learning to gain new understanding from existing data within your organisation.
  • Migrating Applications to the Cloud – Learn what it takes to modernise your applications and prepare them for successful migration to the cloud.
  • Optimise Teamwork in your Organisation – Reveal the collaborative workforce within your business and learn how to build effective teams.
  • Securing your Organisation – Build a secure organisation without compromising the productivity of your business.

 

Microsoft Ignite 2019 @ Sydney : A Quick Tour!

Microsoft Ignite 2019 @ Sydney was held in the International Convention Center Sydney (ICC Sydney) over two days – 13th and 14th February 2019. Yes, even on Valentine’s Day. These are truly dedicated professionals!

The Microsoft Ignite 2019 @ Sydney covered two halls, as well as several lecture halls. In our quick tour, we take a look at the Hub, where participants get to interact directly with Microsoft professionals and their partners.

Note : We intentionally recorded our tour while most participants were inside the lecture halls for the workshops. Otherwise, the halls would be swarmed with people, making it difficult for us to record our tour.

 

The 2019 Imagine Cup Asia Showcase

As the 2019 Imagine Cup Asia concluded a day earlier, Microsoft also took the opportunity to showcase the projects of the 12 top Asian teams.

We highly recommend you check out what these young entrepreneurs came up with this year. Be inspired by their creativity!

 

Recommended Reading


Go Back To > Software | Business | Home

 

Support Tech ARP!

If you like our work, you can help support our work by visiting our sponsors, participating in the Tech ARP Forums, or even donating to our fund. Any help you can render is greatly appreciated!


Exclusive : Dimension Data On Living In A Multi-Cloud World!

On 30 October 2018, Dimension Data held a forum on Living in a Multi-Cloud World with F5 Networks. We had the exclusive opportunity to interview three key Dimension Data executives. Find out what we learned!

 

Dimension Data On Living In A Multi-Cloud World!

The Living In A Multi-Cloud World forum had a tagline of Connect, Automate, Secure. Let’s find out from Andy Cocks (CTO of Dimension Data Asia Pacific), Sandy Woo (Solutions Director of Dimension Data Malaysia), and Neville Burdan (Director of Cybersecurity, Dimension Data Asia Pacific) exactly what they mean!

Here are some key excerpts from the interview :

  • A key problem for many organisations is their inability to hire and retain talent to manage their infrastructure, so it would be cheaper and more efficient to leverage multi-cloud platforms
  • Managing a company’s digital requirements is getting exponentially complex, so the choice is to either outsource the job, or automate the various tasks.
  • There is also a lack of awareness about the advantages of adopting a multi-cloud platform, coupled with a lack of hyperscalers in Malaysia
  • The Dimension Data Managed Cloud Platform, which launched in July 2018, attempts to bridge the gap between demand and availability of cloud computing and storage, with multi-cloud support.

 

Multi-Cloud On Dimension Data Managed Cloud Platform

The Dimension Data Managed Cloud Platform, introduced in Malaysia in July 2018, is multi-cloud capable. Its Managed Services Operation Portal allows you to provision, deploy, monitor, and manage multiple instances from other cloud service providers like Microsoft Azure and Amazon Web Services.

Enterprise-Grade Security & Reliability

The Dimension Data Managed Cloud Platform offers enterprise-grade security and reliability, backed by a modern data centre with robust backup and disaster recovery capabilities to ensure business continuity. This allows their clients to create a hybrid environment that is highly scalable and responsive to their business needs, without worrying about security or reliability.

Compliance & Certification

The Dimension Data Managed Cloud Platform is certified to be compliant with SOC1 Type II, SOC2 Type II, ISO:22301, ISO:27001, and ISO:9001.

SAP HANA Enterprise Cloud Premium Partner

Dimension Data is one of only five global SAP HANA Enterprise Cloud Premium Partners, so it’s no surprise that the Dimension Data Managed Cloud Platform supports SAP S/4HANA in addition to other business enterprise applications and business productivity suites.

 

Suggested Reading


Go Back To > Enterprise + Business | Home

 

Support Tech ARP!

If you like our work, you can help support our work by visiting our sponsors, participating in the Tech ARP Forums, or even donating to our fund. Any help you can render is greatly appreciated!

The Dimension Data Managed Cloud Platform Revealed!

Dimension Data just announced that they are expanding their enterprise-grade Managed Cloud Platform (MCP) across Asia Pacific. Here in Malaysia, they are introducing the Dimension Data Managed Cloud Platform in partnership with the NTT Group.

 

The Dimension Data Managed Cloud Platform Explained

Building on the success of their global and regional Managed Cloud Platforms, Dimension Data is introducing their MCP in Malaysia, to better serve their clients and meet local demand for enterprise-grade hybrid cloud services.

The Dimension Data Managed Cloud Platform allows clients to securely deploy their applications and workloads into the cloud, while preserving data sovereignty. This platform also comes with built-in automation to help clients manage and capitalise on the benefits of multi-cloud environments.

SAP HANA Enterprise Cloud Premium Partner

Dimension Data is one of only five global SAP HANA Enterprise Cloud Premium Partners, so it’s no surprise that the Dimension Data Managed Cloud Platform supports SAP S/4HANA in addition to other business enterprise applications and business productivity suites.

Compliance & Certification

The Dimension Data Managed Cloud Platform is certified to be compliant with SOC1 Type II, SOC2 Type II, ISO:22301, ISO:27001, and ISO:9001.

Multi-Cloud Capable

Dimension Data Managed Cloud Platform is multi-cloud capable. Its Managed Services Operation Portal allows you to provision, deploy, monitor, and manage multiple instances from other cloud service providers like Microsoft Azure and Amazon Web Services.

Enterprise-Grade Security & Reliability

Partnering with the NTT Group allows the Dimension Data Managed Cloud Platform to deliver enterprise-grade security and reliability. They tout a modern data centre with robust backup and disaster recovery capabilities to ensure business continuity. This allows their clients to create a hybrid environment that is highly scalable and responsive to their business needs, without worrying about security or reliability.

 

Suggested Reading


Go Back To > Enterprise + Business | Home

 

Support Tech ARP!

If you like our work, you can help support our work by visiting our sponsors, participating in the Tech ARP Forums, or even donating to our fund. Any help you can render is greatly appreciated!

Progenet Cloud To Be Powered By New Data Center

Key Alliance Group (KAG) today unveiled the site of their new state-of-the-art data center that will power their Progenet Cloud service. This is part of their move to take advantage of the new Digital Free Trade Zone (DFTZ) announced by the Malaysian government.

 

Progenet Cloud

KAG acquired Progenet Sdn. Bhd., a boutique cloud service provider, and rebranded it as Progenet Innovations Sdn. Bhd. (PGI).

In conjunction with this acquisition, KAG began construction of a dedicated data center on the 4th floor of Menara Lien Hoe. This data center is expected to be completed in March 2018.

When completed, Progenet Cloud will be the first end-to-end cloud service in Malaysia – they will own the real estate, the data center and the cloud service. Until then, Progenet Cloud is hosted at the AIMS and Equinix data centers.


The new data center will be a carrier-neutral data center, with 10,000 square feet per floor, supporting up to 160 racks. It will be powered by Vertiv’s SmartAisle – a row-based enclosure system that combines racks, power, cooling and infrastructure management. KAG’s other partners in creating the new data center include Kaspersky and BitGlass.

The future data center is designed for high-density web-scale infrastructure, allowing PGI to offer both public and private cloud services. This facility also comes with the latest Data Center Infrastructure Management (DCIM) solution, allowing for more efficient management and better risk mitigation.

Once the new facility is operational next year, Progenet Innovations will be able to offer higher levels of service, better disaster recovery options, and new cloud computing services.

Go Back To > Events | Home

 

Support Tech ARP!

If you like our work, you can help support our work by visiting our sponsors, participating in the Tech ARP Forums, or even donating to our fund. Any help you can render is greatly appreciated!

The AWS Masterclass on Artificial Intelligence by Olivier Klein

Just before we flew to Computex 2017, we attended the AWS Masterclass on Artificial Intelligence. It offered us an in-depth look at AI concepts like machine learning, deep learning and neural networks. We also saw how Amazon Web Services (AWS) uses all that to create easy-to-use tools for developers to create their own AI applications at low cost and virtually no capital outlay.

 

The AWS Masterclass on Artificial Intelligence

AWS Malaysia flew in Olivier Klein, the AWS Asia Pacific Solutions Architect, to conduct the AWS Masterclass. During the two-hour session, he demonstrated how easily the various AWS services and tools allow virtually anyone to create their own AI applications, at low cost and with virtually no capital outlay.

The topic of artificial intelligence is rather wide-ranging, and the masterclass covered everything from basic AI concepts to demonstrations of how to use AWS services like Amazon Polly and Amazon Rekognition to quickly and easily create AI applications. We present to you – the complete AWS Masterclass on Artificial Intelligence!

The AWS Masterclass on AI is actually made up of 5 main topics. Here is a summary of those topics :

  • AWS Cloud and An Introduction to Artificial Intelligence, Machine Learning, Deep Learning (15 minutes) – An overview of Amazon Web Services and the latest innovations in the data analytics, machine learning, deep learning and AI space.

  • The Road to Artificial Intelligence (20 minutes) – Demystifying AI concepts and related terminologies, as well as the underlying technologies. This segment dives deeper into the concepts of machine learning and deep learning models, such as neural networks, and how they lead to artificial intelligence.

  • Connecting Things and Sensing the Real World (30 minutes) – As part of building an AI that aligns with our physical world, we need to understand how the Internet of Things (IoT) helps to create natural interaction channels. This segment walks through real-world examples and demonstrations, including voice interactions through Amazon Lex, Amazon Polly and the Alexa Voice Service, and visual recognition with services such as Amazon Rekognition, all bridged with real-time data sensed from the physical world via AWS IoT.

  • Retrospective and Real-Time Data Analytics (30 minutes) – Every AI must continuously “learn” and be “trained” through past performance and feedback data, making retrospective and real-time data analytics crucial to building intelligent models. This segment dives into some of the new trends and concepts that AWS customers use to perform fast and cost-effective analytics.
In the next two pages, we will dissect the video and share with you the key points from each segment of this AWS Masterclass.

Next Page > Introduction To AWS Cloud & Artificial Intelligence, The Road To AI


 

Support Tech ARP!

If you like our work, you can help support our work by visiting our sponsors, participating in the Tech ARP Forums, or even donating to our fund. Any help you can render is greatly appreciated!

The AWS Masterclass on AI Key Points (Part 1)

Here is an exhaustive list of key takeaway points from the AWS Masterclass on Artificial Intelligence, with their individual timestamps in the video :

Introduction To AWS Cloud

  • AWS has 16 regions around the world (0:51), with two or more availability zones per region (1:37), and 76 edge locations (1:56) to accelerate end-user connectivity to AWS services.
  • AWS offers 90+ cloud services (3:45), all of which use the On-Demand Model (4:38) – you pay only for what you use, whether that’s a GB of storage or transfer, or execution time for a computational process.
  • You don’t even need to plan your requirements or inform AWS how much capacity you need (5:05) – just use what you need, and pay for what you use.
  • AWS has a practice of passing their cost savings to their customers (5:59), cutting prices 61 times since 2006.
  • AWS keeps adding new services over the years (6:19), with over a thousand new services introduced in 2016 (7:03).
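The On-Demand Model described above boils down to simple metering: the bill is just usage multiplied by unit rates, with no upfront commitment. Here is a toy sketch of that arithmetic – the rates are made up for illustration and are not actual AWS prices :

```python
# Hypothetical unit rates for illustration only -- real AWS pricing
# varies by service, region and instance type.
RATE_PER_INSTANCE_HOUR = 0.05  # USD per compute hour (assumed)
RATE_PER_GB_MONTH = 0.025      # USD per GB-month of storage (assumed)

def monthly_bill(instance_hours, gb_stored):
    """Pay-per-use: metered usage times unit rates, nothing else."""
    return (instance_hours * RATE_PER_INSTANCE_HOUR
            + gb_stored * RATE_PER_GB_MONTH)

# One instance running the whole month (720 hours) plus 100 GB stored.
print(round(monthly_bill(720, 100), 2))  # → 38.5
```

Shut the instance down halfway through the month and the compute portion of the bill halves too – that is the essence of paying only for what you use.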

Introduction to Artificial Intelligence, Machine Learning, Deep Learning

  • Artificial intelligence is based on unsupervised machine learning (7:45), specifically deep learning models.
  • Insurance companies like AON use it for actuarial calculations (7:59), and services like Netflix use it to generate recommendations (8:04).
  • A lot of AI models have been built specifically around natural language understanding, and using vision to interact with customers, as well as predicting and understanding customer behaviour (9:23).
  • Here is a quick look at what the AWS services management console looks like (9:58).
  • This is how you launch 10 compute instances (virtual servers) in AWS (11:40).
  • The ability to access multiple instances quickly is very useful for AI training (12:40), because it gives the user access to large amounts of computational power, which can be quickly terminated (13:10).
  • Machine learning, or specifically artificial intelligence, is not new to Amazon.com, the parent company of AWS (14:14).
  • Amazon.com uses a lot of AI models (14:34) for recommendations and demand forecasting.
  • The visual search feature in Amazon app uses visual recognition and AI models to identify a picture you take (15:33).
  • Olivier introduces Amazon Go (16:07), a prototype grocery store in Seattle.

The Road to Artificial Intelligence

  • The first component of any artificial intelligence is the “ability to sense the real world” (18:46), connecting everything together.
  • Cheaper bandwidth (19:26) now allows more devices to be connected to the cloud, allowing more data to be collected for the purpose of training AI models.
  • Cloud computing platforms like AWS allow the storage and processing of all that sensor data in real time (19:53).
  • All of that information can be used in deep learning models (20:14) to create an artificial intelligence that understands, in a natural way, what we are doing, and what we want or need.
  • Olivier shows how machine learning can quickly solve a Rubik’s cube (20:47), which has 43 quintillion unique combinations.
  • You can even build a Raspberry Pi-powered machine (24:33) that can solve a Rubik’s cube puzzle in 0.9 seconds.
  • Some of these deep learning models are available on Amazon AI (25:11), which is a combination of different services (25:44).
  • Olivier shows what it means to “train a deep learning model” (28:19) using a neural network (29:15).
  • Deep learning is computationally-intensive (30:39), but once it derives a model that works well, the predictive aspect is not computationally-intensive (30:52).
  • A pre-trained AI model can be loaded into a low-powered device (31:02), allowing it to perform AI functions without requiring large amounts of bandwidth or computational power.
  • Olivier demonstrates the YOLO (You Only Look Once) project, which pre-trained an AI model with pictures of objects (31:58), which allows it to detect objects in any video.
  • The identification of objects is the baseline for autonomous driving systems (34:19), as used by TuSimple.
  • TuSimple also used a similar model to train a drone to detect and follow a person (35:28).
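The 43 quintillion figure quoted above (20:47) is not a loose estimate; it is the order of the Rubik’s cube group, and can be computed directly from the piece counts:

```python
# 8 corner pieces can be permuted and twisted, 12 edge pieces permuted and
# flipped; the last corner twist and last edge flip are forced by the others,
# and permutation parity removes a further factor of 2.
from math import factorial

corner_permutations = factorial(8)    # arrangements of the 8 corner pieces
corner_orientations = 3 ** 7          # 7 corners twist freely, the 8th is forced
edge_permutations   = factorial(12)   # arrangements of the 12 edge pieces
edge_orientations   = 2 ** 11         # 11 edges flip freely, the 12th is forced

total = (corner_permutations * corner_orientations *
         edge_permutations * edge_orientations) // 2  # permutation parity

print(total)  # 43252003274489856000, roughly 43 quintillion
```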



 

Support Tech ARP!

If you like our work, you can help support our work by visiting our sponsors, participating in the Tech ARP Forums, or even donating to our fund. Any help you can render is greatly appreciated!

The AWS Masterclass on AI Key Points (Part 2)

Connecting Things and Sensing the Real World

  • Cloud services like AWS IoT (37:35) allow you to securely connect billions of IoT (Internet of Things) devices.
  • Olivier prefers to think of IoT as Intelligent Orchestrated Technology (37:52).
  • Olivier demonstrates how the combination of multiple data sources (maps, vehicle GPS, real-time weather reports) in Bangkok can be used to predict traffic as well as road conditions to create optimal routes (39:07), reducing traffic congestion by 30%.
  • The PetaBencana service in Jakarta uses picture recognition and IoT sensors to identify flooded roads (42:21) for better emergency response and disaster management.
  • Olivier demonstrates how easy it is to connect IoT devices to the AWS IoT service (43:46), and use them to sense and interact with the environment.
  • Olivier shows how the capabilities of the Amazon Echo can be extended by creating an Alexa Skill using the AWS Lambda function (59:07).
  • Developers can create and publish Alexa Skills for sale in the Amazon marketplace (1:03:30).
  • Amazon Polly (1:04:10) renders life-like speech, while the Amazon Lex conversational engine (1:04:17) has natural language understanding and automatic speech recognition. Amazon Rekognition (1:04:29) performs image analysis.
  • Amazon Polly (1:04:50) turns text into life-like speech using deep learning to change the pitch and intonation according to the context. Olivier demonstrates Amazon Polly’s capabilities at 1:06:25.
  • Amazon Lex (1:11:06) is a web service that allows you to build conversational interfaces using the same natural language understanding (NLU) and automatic speech recognition (ASR) models that power Alexa.
  • Amazon Lex does not just support spoken natural language understanding, it also recognises text (1:12:09), which makes it useful for chatbots.
  • Olivier demonstrates these text recognition capabilities in a chatbot demo (1:13:50) of a customer applying for a credit card through Facebook.
  • Amazon Rekognition (1:21:37) is an image recognition and analysis service, which uses deep learning to identify objects in pictures.
  • Amazon Rekognition can even detect facial landmarks and sentiments (1:22:41), as well as image quality and other attributes.
  • You can actually try Amazon Rekognition out (1:23:24) by uploading photos at CodeFor.Cloud/image.
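In code, a Rekognition response pairs each detected label with a confidence score, which applications typically filter against a threshold. Here is a hedged sketch of that pattern; `sample_response` is a hand-made payload shaped like Rekognition’s DetectLabels output, not a real API result, and a real call would go through the `boto3` client instead.

```python
# Hedged sketch: filtering labels from an Amazon Rekognition-style response
# by confidence. sample_response is a hand-made stand-in for the payload a
# real boto3 rekognition detect_labels call would return.
def labels_above(response, threshold):
    """Return the names of labels whose confidence meets the threshold."""
    return [label["Name"] for label in response["Labels"]
            if label["Confidence"] >= threshold]

sample_response = {
    "Labels": [
        {"Name": "Car", "Confidence": 98.1},
        {"Name": "Road", "Confidence": 75.0},
    ]
}

print(labels_above(sample_response, 90))  # ['Car']
```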

Retrospective and Real-Time Data Analytics

  • AI is a combination of 3 types of data analytics (1:28:10) – retrospective analysis and reporting + real-time processing + predictions to enable smart apps.
  • Cloud computing is extremely useful for machine learning (1:29:57) because it allows you to decouple storage and compute requirements for much lower costs.
  • Amazon Athena (1:31:56) allows you to query data stored in Amazon S3, without creating a compute instance to do it. You pay only for the amount of data scanned by each query.
  • Best of all, you will get the same fast results even if your data set grows (1:32:31), because Amazon Athena will automatically parallelise your queries across your data set internally.
  • Olivier demonstrates (1:33:14) how Amazon Athena can be used to run queries on data stored in Amazon S3, as well as generate reports using Amazon QuickSight.
  • When it comes to data analytics, cloud computing allows you to quickly bring massive computing power to bear, achieving much faster results without additional cost (1:41:40).
  • The insurance company Aon used this ability (1:42:44) to cut an actuarial simulation from 10 days down to just 10 minutes.
  • Amazon Kinesis and Amazon Kinesis Analytics (1:45:10) allow the processing of real-time data.
  • A company called Dash is using this capability to analyse OBD data in real-time (1:47:23) to help improve fuel efficiency and predict potential breakdowns. It also notifies emergency services in case of a crash.
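Athena’s pay-per-query model described above is easy to reason about. The sketch below assumes the widely advertised rate of $5 per TB scanned; the exact price varies by region and over time, so treat the figure as an assumption, not the current rate.

```python
# Hedged sketch: estimating the cost of an Amazon Athena query from the
# bytes it scans. The $5-per-TB rate is an assumption (Athena's commonly
# advertised price); check current regional pricing before relying on it.
PRICE_PER_TB = 5.00
BYTES_PER_TB = 10 ** 12  # decimal terabytes

def athena_query_cost(bytes_scanned):
    """Approximate cost in USD of a query that scanned `bytes_scanned`."""
    return (bytes_scanned / BYTES_PER_TB) * PRICE_PER_TB

# A query scanning 250 GB of S3 data:
print(round(athena_query_cost(250 * 10**9), 2))  # 1.25
```

This is also why compressing or columnar-formatting data in S3 directly reduces query cost: fewer bytes scanned, smaller bill.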



 


AWS Summit 2017 – What’s New In Amazon Web Services

On 18 April 2017, the One World Hotel was besieged by a massive crowd. One might have thought they were there for a rock concert. They were really there for the Amazon Web Services Summit 2017. Here is a look at what’s new in Amazon Web Services!

 

The AWS Summit 2017

With 2 keynotes and over 20 technology sessions, the AWS Summit 2017 was a great opportunity for IT managers and professionals to get updated on the latest AWS services, and what they have in the pipeline.

The highlight of the AWS Summit 2017 was a 90-minute keynote by Adrian Cockcroft, Vice President of Cloud Architecture Strategy, Amazon Web Services.

Here are some key takeaways from his presentation:

  • Amazon Web Services is adding new capabilities on a daily basis, with over a thousand in 2016.
  • Amazon will introduce Lightsail, a simple VPS service, to the Singapore AWS Region in the next few weeks.
  • Amazon Athena allows you to quickly query data stored in S3, whether it is compressed and/or encrypted. It will also be available in the Singapore AWS Region in the next few weeks.
  • Amazon Connect is a cloud-based contact center solution that is available today. It leverages Amazon Lex for natural language understanding and automatic speech recognition, and AWS Lambda for data and business intelligence.
  • AWS also announced the Amazon Aurora PostgreSQL-Compatible Edition service, which is currently in developer preview. It promises to offer several times better performance than a typical PostgreSQL database at 1/10th of the cost.
  • AWS Lambda just introduced support for Node.js 6.10 and C#, AWS Serverless Application Model and Environment Variables.
  • The existing AWS DDoS protection has been rebranded as AWS Shield. It protects all web applications from volumetric and state-exhaustion attacks.
  • The new AWS Shield Advanced service is designed to protect enterprises against more sophisticated attacks. It includes advanced notifications and cost protection, as well as WAF (Web Application Firewall) at no additional cost.
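Several of the announcements above build on AWS Lambda, which runs small functions in response to events with no server to manage. As a hedged sketch of the programming model, here is the Python runtime’s conventional handler shape, invoked locally rather than deployed; the event fields are made up for illustration.

```python
# Hedged sketch: the shape of a minimal AWS Lambda handler (Python runtime).
# Lambda calls the handler you configure, passing the event payload and a
# context object; whatever you return becomes the invocation result.
def lambda_handler(event, context):
    """Greet the caller named in the (hypothetical) event payload."""
    name = event.get("name", "world")
    return {"statusCode": 200, "body": f"Hello, {name}!"}

# Local simulation of an invocation, with no AWS account needed:
print(lambda_handler({"name": "AWS Summit"}, None))
```

The same pattern backs the Alexa Skills mentioned earlier: Alexa sends the event, and the Lambda function’s return value drives the response.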


 


NVIDIA GRID Boosts Blast Extreme in VMware Horizon

What IT department wouldn’t want to be able to do more with their current infrastructure? With NVIDIA GRID, they can.

NVIDIA GRID acceleration of Blast Extreme — a new protocol for optimizing the mobile cloud — is now supported in VMware Horizon 7. What does that mean for IT managers? Reduced latency. Improved performance. And up to 18 percent more users.

NVIDIA and VMware have been working together for years to improve the virtualized computing user experience and enable a whole new class of virtual use cases. We were the first to enable hardware-accelerated graphics rendering in VMware Horizon View. Then we enabled the first virtualized graphics acceleration in Horizon with GRID vGPU.

Now, using the new Blast Extreme protocol, NVIDIA GRID offloads encoding from the CPU to the GPU. This frees up resources and lowers the demand on network infrastructure, which lets organizations reach more remote users. In tests of key applications like ESRI ArcGIS Pro, scalability increased by up to 18 percent. That’s without investing in new hardware.

 

Bigger Graphical Workloads, Better Performance

Using VMware Horizon 7 with NVIDIA Blast Extreme Acceleration, organizations can now increase the number of graphical workloads per server while delivering a superior user experience to the most remote networks.

Testing has shown latency improvements of up to 50 milliseconds, and reduced bandwidth requirements ranging from 19 to 89 percent. This increases network tolerance for graphical workloads and allows more use cases to take advantage of accelerated virtual desktops and applications.

“NVIDIA GRID with VMware Horizon extends the performance of existing infrastructure so customers can meet the needs of their modern, remote workforces,” said Pat Lee, senior director of product management for End-User Computing, at VMware. “This offering puts the world’s most powerful applications and the best user experience on every device, at scale, anywhere in the world.”

We have a long track record of working with VMware to expand the use of graphics-accelerated virtualized desktops and applications. Companies worldwide have adopted our combined solution to unlock workforce productivity and deliver incredible mobility, flexibility and user experience, all from the cloud.

 


Microsoft Philanthropies Donates US$1 Billion In Resources

DAVOS-KLOSTERS, Switzerland — Jan. 19, 2016 — Microsoft Corp. CEO Satya Nadella announced a new three-part initiative to ensure that Microsoft’s cloud computing resources serve the public good. As part of this initiative, the recently formed Microsoft Philanthropies will donate $1 billion of Microsoft Cloud Services, measured at fair market value, to serve nonprofits and university researchers over the next three years.

 

Microsoft Philanthropies Donates US$1 Billion In Resources

Microsoft’s three-part commitment focuses on ensuring the cloud can serve the public good in the broadest sense by providing additional cloud resources to nonprofits, increasing access for university researchers and helping solve last-mile Internet access challenges.

“Microsoft is empowering mission-driven organizations around the planet with a donation of cloud computing services — the most transformative technologies of our generation,” said Microsoft CEO Satya Nadella, who on Wednesday will speak at the World Economic Forum in Davos, Switzerland. “Now more than 70,000 organizations will have access to technology that will help them solve our greatest societal challenges and ultimately improve the human condition and drive new growth equally.”

Cloud computing has emerged as a vital resource for unlocking the secrets held by data in ways that create new insights and lead to breakthroughs not just for science and technology, but for the full range of economic and social challenges and the delivery of better human services. It can also improve communications and problem-solving and help organizations work in a more productive and more efficient manner.

In September 2015, 193 heads of state and other world leaders unanimously adopted 17 sustainable development goals to achieve by 2030. This ambitious agenda — which includes ending poverty, ending hunger, and ensuring affordable, reliable and sustainable energy for all — will only be achievable with the benefit of significant inventions and technology innovations. The scale and computational power enabled by cloud computing will be essential to unlocking solutions to this list of some of the world’s seemingly unsolvable problems.

“We’re committed to helping nonprofit groups and universities use cloud computing to address fundamental human challenges,” said Microsoft President Brad Smith. “One of our ambitions for Microsoft Philanthropies is to partner with these groups and ensure that cloud computing reaches more people and serves the broadest array of societal needs.”

Specific elements of the new initiative include these:

  • Serving the broad needs of the nonprofit community. A new global donation program will make Microsoft Cloud Services, including Microsoft Azure, Power BI, CRM Online and the Enterprise Mobility Suite, more available to nonprofit organizations through Microsoft Philanthropies. The program builds upon an already successful program that provides similar access to Office 365 for nonprofits. The nonprofit program for Microsoft Cloud Services will begin rolling out this spring, and Microsoft Philanthropies aims to serve 70,000 nonprofits in the next three years with these Microsoft Cloud Services.
  • Expanding access to cloud resources for faculty research in universities. Microsoft Research and Microsoft Philanthropies will expand by 50 percent the Microsoft Azure for Research program that grants free Azure storage and computing resources to help faculty accelerate their research on cutting-edge challenges. Today this program provides free cloud computing resources for over 600 research projects on six continents.
  • Reaching new communities with last-mile connectivity and cloud services. Microsoft Philanthropies and Microsoft Business Development will combine donated access to Microsoft Cloud services with investments in new, low-cost last-mile Internet access technologies and community training. By combining cloud services with connectivity and training, and focusing on new public-private partnerships, Microsoft Philanthropies intends to support 20 of these projects in at least 15 countries around the world by the middle of 2017.

Providing nonprofits with better access to Microsoft Cloud Services, including the powerful Microsoft Azure platform, builds upon Microsoft’s longtime commitment to making cutting-edge technology available at no or low cost to organizations working on solving some of society’s toughest problems.

In recent years, as organizations have increased their reliance on cloud computing, Microsoft has worked in partnership with a broad range of organizations focused on big challenges. The initiatives show the potential impact that increased access to the transformational power of cloud computing can have:

  • Microsoft Research is working with the São Paulo Research Foundation (FAPESP) Biodiversity Research Program through the use of 700 wireless sensors, cloud technology and automated data-stream processing to understand how cloud forests work and study the impact of climate changes on the communities supported by those forests.
  • Through a partnership with the University of Texas at Austin called Project Catapult, Microsoft makes advanced cloud computing technology available to researchers that have demonstrated the ability to deliver lower power and cost, higher-quality results, or a combination of both.
  • In Botswana, Microsoft is partnering with the Botswana Innovation Hub, Vista Life Sciences, the United States Agency for International Development and Global Broadband Solutions to assist Botswana, the University of Pennsylvania and the Ministry of Health in leveraging cloud-based health records management and Internet access enabled by use of TV white spaces to remotely deliver specialized medicine, including cervical cancer screenings to women at rural healthcare clinics.

“Access to technology is critical to the operations and services of NetHope and its 44 humanitarian nonprofit member organizations,” said NetHope CEO Lauren Woodman. “The power of cloud computing will create exponential value for all we do to serve the millions of people in our communities around the world.”


 


The AMD Opteron A1100 Technology Report

AMD today officially launched the AMD Opteron A1100 System-on-Chip (SoC). Formerly codenamed “Seattle“, the AMD Opteron A1100 is the first AMD device to use the 64-bit ARMv8-A processor microarchitecture, instead of the usual x86 microarchitecture.

The ARMv8-A microarchitecture allows for a more energy-efficient device, but AMD’s unfamiliarity with it may be why the Opteron A1100 was delayed by almost 18 months. Today’s launch marks the end of that protracted development process.

 

The AMD Opteron A1100 Revealed

The AMD Opteron A1100 was designed around the ARM Cortex-A57 processor cores to deliver better power efficiency and lower costs. Coupled with enterprise-class features, AMD is hoping that it will allow them to penetrate the datacenter or hyperscale computing market.

The AMD Opteron A1100 processor will feature up to 8 ARM Cortex-A57 processor cores, backed by up to 4 MB of shared L2 cache and a large 8 MB L3 cache. Its dual-channel memory controller supports up to 128 GB of DDR3 or DDR4 memory.

Connectivity-wise, it supports two 10 Gb Ethernet ports, a single PCI Express 3.0 slot and up to 14 SATA3 devices. Take a look at the Opteron A1100 chip diagram.

Here are the market segments that AMD plans to target with the Opteron A1100 SoC. Basically, anyone who wants a cheap, efficient SoC for massive deployments.


To support those solutions, AMD has already prepared a complete solution stack. This is probably what took them so long to qualify the Opteron A1100 for the enterprise market.


 


AMD Opteron A1100 Model & Package Details

AMD will initially offer 3 models – the AMD Opteron A1170, the AMD Opteron A1150 and the AMD Opteron A1120. Here are their key specifications:

Whether they come with 4 or 8 processor cores, all Opteron A1100 processors will have the same SP1 package, because they are all the same chip. The quad-core models will just have half the cores disabled.


 

AMD Opteron A1100 Memory Support

To cater to the enterprise market, the AMD Opteron A1100 will support both DDR3 and DDR4 memory in three different interfaces – UDIMM, RDIMM and SO-DIMM.

Basically, the AMD Opteron A1100 will support memory speeds up to 1866 MHz for DDR4 memory, and 1600 MHz for DDR3 memory. You can run them in single- or dual-channel modes using up to 4 memory modules.
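As a rough back-of-the-envelope sketch, those memory speeds translate into peak theoretical bandwidth as follows. The calculation assumes the standard 8-byte (64-bit) DIMM data path; real-world throughput will be lower.

```python
# Hedged sketch: peak theoretical memory bandwidth for the Opteron A1100's
# supported speeds, assuming a standard 64-bit (8-byte) channel width.
BYTES_PER_TRANSFER = 8  # 64-bit data path per channel

def peak_bandwidth_gbs(megatransfers_per_sec, channels):
    """Peak bandwidth in GB/s for a given transfer rate (MT/s) and channel count."""
    return megatransfers_per_sec * 1_000_000 * BYTES_PER_TRANSFER * channels / 1e9

print(peak_bandwidth_gbs(1866, 2))  # DDR4-1866, dual-channel: ~29.9 GB/s
print(peak_bandwidth_gbs(1600, 2))  # DDR3-1600, dual-channel: ~25.6 GB/s
```

Dual-channel operation doubles the single-channel figure, which is why populating both channels matters for memory-bound server workloads.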

Go Back To > First PageArticles | Home

 
