Weekly Assignment 4
ADVE 709 Creative Strategies
Professor Gauri Misra-Deshpande
Maggie Walker
Tech and Creativity: Research any 3 new technologies that are not yet mainstream but show
potential to become customer-facing. Write a 500-word paragraph for each technology, covering
who has developed it and what kind of backing it has received.
5G Telecommunications
5G is the fifth-generation mobile network, a new global wireless standard following the 1G, 2G,
3G, and 4G networks. 5G enables a new kind of network designed to connect virtually everyone
and everything, including machines, objects, and devices. 5G wireless technology is meant to
deliver higher multi-Gbps peak data speeds, ultra-low latency, greater reliability, massive
network capacity, increased availability, and a more uniform user experience to more users.
Higher performance and improved efficiency empower new user experiences and connect new
industries.
No single company or person invented or owns 5G; rather, several companies within the mobile
ecosystem are contributing to bringing 5G to life. Qualcomm has played a major role in
inventing many of the foundational technologies that drive the industry forward and make up 5G,
the next wireless standard. The 3rd Generation Partnership Project (3GPP) is the industry
organization that defines the global specifications for 3G UMTS, 4G LTE, and 5G technologies.
3GPP is driving many essential inventions across all aspects of 5G design, from the air interface
to the service layer. Other 3GPP 5G members range from infrastructure vendors and
component/device manufacturers to mobile network operators and vertical service providers.
With 4G wireless communications less than 10 years old, it seems odd to be discussing
replacing it, but that is exactly what is happening with 5G, the emerging standard in voice and
data telecommunications. 1G was analog, 2G was digital for higher-quality voice, 3G started to
provide higher rates allowing for more data-oriented applications, and 4G has allowed for the
ongoing growth in mobile applications and video over mobile.
The 5G specs aim squarely at the Internet of Things, with an eye toward supporting the millions
of sensors that scientists expect to see deployed in support of smart homes, smart buildings,
and smart cities. 5G will have to support a far greater density of connected devices. It will also
have to provide much lower end-to-end latencies than today's cellular networks. Bhaskar
Krishnamachari is a professor of Electrical Engineering and Computer Science and director of
the Center for Cyber-Physical Systems and the Internet of Things at the USC Viterbi School of
Engineering. His team is exploring the likely interactions of 5G-supported sensors in urban
settings, where networks will need to handle data on such diverse phenomena as traffic flows,
air quality and noise pollution, disasters, security incidents, and crowds.
“5G will enable much greater capabilities across a wide range of problems,” said Darrell M.
West, vice president and director of governance studies at the Brookings Institution. “It will be
faster, and there will also be more intelligent management of the network. You can have millions
of sensors but you also need the means to deal with the flood of information that comes out of
that. 5G includes much more advanced data analytics and network management.”
While scientists understand the technology driving 5G, actual implementation remains a few
years off, with likely rollouts beginning in the 2020 time frame. Questions of spectrum
allocation, among other issues, still need to be resolved. Local governments will likely be called
upon to play a part in any eventual deployment.
Hybrid Cloud Infrastructure
Hybrid cloud refers to a mixed computing, storage, and services environment made up of
on-premises infrastructure, private cloud services, and a public cloud such as Amazon Web
Services or Microsoft Azure, with orchestration among the various platforms. Using a
combination of public clouds, on-premises computing, and private clouds in your data center
means that you have a hybrid cloud infrastructure.
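The orchestration idea above can be sketched in a few lines of Python: a hypothetical placement function that routes a workload to the private side when its data is sensitive and lets large jobs burst to a public provider. Every name and rule here is an illustrative assumption, not a real orchestration API.

```python
# Hypothetical sketch of hybrid-cloud workload placement (illustrative only).
from dataclasses import dataclass

@dataclass
class Workload:
    name: str
    data_sensitivity: str  # "public" or "confidential" (assumed labels)
    cpu_cores: int

def place_workload(w: Workload) -> str:
    """Route a workload to a platform in a hybrid environment.

    Confidential data stays on the private-cloud/on-premises side;
    large jobs burst to a public provider such as AWS or Azure.
    """
    if w.data_sensitivity == "confidential":
        return "private-cloud"
    if w.cpu_cores > 32:  # big batch jobs rent public capacity on demand
        return "public-cloud"
    return "on-premises"

# Example: a customer-records service holding confidential data
print(place_workload(Workload("customer-db", "confidential", 8)))  # private-cloud
```

Real orchestration layers weigh many more factors (cost, latency, compliance zones), but the shape of the decision is the same: one policy spanning all the platforms in the mix.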
The underlying concept of cloud computing goes back to the 1960s and the idea of an
“Intergalactic Computer Network” proposed by Joseph Carl Robnett Licklider, known as Lick.
Lick had the unusual background of both psychologist and computer scientist and was referred
to as computing’s Johnny Appleseed. His concept of an “Intergalactic Computer Network” at the
Advanced Research Projects Agency, created under Eisenhower, became the basis for
ARPANET, which in turn became the Internet. In many ways cloud computing actually evolved
from core concepts that are very mainframe-centric. The biggest step toward the current
concept we call cloud computing was the birth of Salesforce.com in 1999, which showcased the
idea of delivering enterprise applications over the Internet (what we now call the cloud) from a
simple website, and it was a huge success. Amazon Web Services was developed in 2002 and
continued to evolve until 2006, when it launched its Elastic Compute Cloud, which allowed small
companies and individuals to rent computing capacity on which to run their applications. Google
Apps was launched in 2009. Assisting behind the scenes was the concept of virtualization,
provided by firms like VMware and Microsoft, which allowed single servers not only to host
multiple application loads but to shift those loads dynamically and in real time, allowing these
cloud implementations both to scale and to become increasingly less expensive. The rapid rise
in capability and reduction in cost is largely what continues to drive cloud computing today.
While a hyperconverged infrastructure’s architectural design, which simplifies management of a
traditional three-tier IT stack by combining storage, compute, networking, and the hypervisor,
was a tremendous leap forward, it still had room to grow. Adding a multicloud environment to
create the hybrid cloud infrastructure’s extension of public clouds back into a customer’s data
center is the next evolutionary step. Benefits of the hybrid cloud include that it is connected to
the data fabric, it is part of the hybrid multicloud experience, and it has partnerships with cloud
providers. A data fabric is an architecture and set of data services that provide consistent
capabilities across a choice of endpoints spanning on-premises and multiple cloud
environments. The data fabric simplifies and integrates data management, extending workloads
from hyperconverged infrastructure alone to a full hybrid cloud infrastructure. Combined with the
data fabric, hybrid cloud infrastructure is elevated to a new experience, bringing together the
best of public cloud and private cloud for a consistent user journey. Only organizations with
NetApp HCI will be empowered to increase productivity, maintain simplicity, and deliver more
services at scale. Partnerships with cloud providers foster a true degree of choice so the best
strategies can be implemented with the right provider.
It has been known for a while that businesses are increasingly moving toward a hybrid cloud
infrastructure. From SaaS applications and on-prem solutions to a mix of public and private
clouds, organizations strike the right balance for their unique infrastructure needs. Over the past
year, major investments in hybrid have been made by large public cloud providers like AWS,
Azure, Google, IBM, and Oracle. We are also seeing OEMs like HPE, Dell, and Cisco increase
investment in building tools that enable simpler connectivity between on-premises data centers
and the cloud. These investments are all about meeting customers where they are: addressing
the challenges of exponential data growth while also being proactive on issues like privacy,
security, and compliance. The modernized approach to hybrid cloud is expanding from
traditional IT to support industrial applications as well. For instance, Honeywell has built its
Forge IoT platform using an open-source and hybrid cloud approach so the industrial data it
manages can integrate more seamlessly with traditional cloud data centers, applications, and
workloads.
Headless Technology
Headless technology means that businesses are now able to separate their front-end
presentation layer from their back-end data functionality to create custom shopping
experiences. This can be as simple as telling your Amazon Alexa to replenish your favorite
coffee or making instant purchases from social media. In short, people are doing a lot more of
this type of commerce. Research shows that 86% of businesses say their customer acquisition
costs have increased in the last 24 months. First, organizations need to maximize the ROI of
their net new customer acquisition costs. Second, it is more important than ever to also focus
on customer development and retention. By moving beyond the omnichannel experience to
connect everything from warehouses to storefronts to online services, companies in 2021 could
become more efficient, more streamlined, and possibly gain a leg up on competitors by
adopting headless technology a little faster than their rivals do.
A headless CMS is a back-end-only content management system built from the ground up as a
content repository that makes content accessible via a RESTful API for display on any device.
The term “headless” comes from the concept of chopping the “head” (the front end, i.e. the
website) off of the “body” (the back end, i.e. the content repository). A headless CMS retains
an interface for adding content and a RESTful API to deliver content wherever you need it.
Because of this approach, a headless CMS does not care about how or where your content is
displayed. Its one focus is storing and delivering structured content.
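As a concrete sketch, a front end might pull structured content from a headless CMS over a RESTful API and render it however it likes. The endpoint URL and JSON field names below are made-up assumptions for illustration; real headless CMS products each define their own schemas.

```python
import json

# A made-up example of the JSON a headless CMS might return from
# GET https://cms.example.com/api/articles/42 (hypothetical endpoint).
response_body = """
{
  "id": 42,
  "title": "Spring Sale",
  "body": "Everything 20% off this week.",
  "tags": ["promo", "retail"]
}
"""

def render_as_html(raw: str) -> str:
    """The 'head' is up to the client: here we render the structured
    content as a tiny HTML fragment; a mobile app could take the same
    JSON and render it natively instead."""
    content = json.loads(raw)
    return f"<h1>{content['title']}</h1><p>{content['body']}</p>"

print(render_as_html(response_body))
# <h1>Spring Sale</h1><p>Everything 20% off this week.</p>
```

The CMS never dictates the markup; the same payload could feed a website, a voice assistant, or a kiosk, which is exactly the separation the headless model is after.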
Cases for using a headless CMS include building a website with a technology you are familiar
with, websites and web apps that use JavaScript frameworks, websites created with static site
generators, native mobile apps, and enriching product information on ecommerce sites. A
headless CMS can deliver your content through an API directly to where you need it. Because
of the headless approach, the content can be used in an iOS app, an Android app, or on any
platform and technology you can think of, making it a powerful option for mobile and web
developers.
A headless CMS provides all of the capabilities of the backend of a traditional CMS, while giving
responsibility of content presentation/layout to the delivery channels. Content is no longer
pushed out to a channel in a predefined manner. Content is pulled or requested from the CMS
by any channel, each using its own presentation capabilities. A RESTful API is an application
programming interface that uses the internet protocol HTTP to request data from a data
source, in this case the CMS. In a pure-play headless situation, the headless CMS
does not generate any front-end code, and provides content as a service, which is why
headless CMS is sometimes referred to as “Content-as-a-Service.” This process results in the
best available digital experience for the end users of a particular device since front-end
developers are able to continue developing new functionality for any channel independent of the
core/backend CMS. A headless CMS can also be defined as handling only the creation,
reading, updating, and deleting of content, while downstream channels that pull content from
the headless CMS are responsible for its presentation to end users.
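That create/read/update/delete contract can be sketched as a minimal in-memory content repository. The class and method names are illustrative assumptions; a real headless CMS exposes the same four operations over HTTP (commonly POST/GET/PUT/DELETE) rather than as direct method calls.

```python
class ContentRepository:
    """Minimal in-memory sketch of a headless CMS backend: it only
    creates, reads, updates, and deletes structured content; rendering
    is left entirely to downstream channels."""

    def __init__(self):
        self._items = {}
        self._next_id = 1

    def create(self, fields: dict) -> int:          # analogous to HTTP POST
        item_id = self._next_id
        self._next_id += 1
        self._items[item_id] = dict(fields)
        return item_id

    def read(self, item_id: int) -> dict:           # analogous to HTTP GET
        return self._items[item_id]

    def update(self, item_id: int, fields: dict):   # analogous to HTTP PUT/PATCH
        self._items[item_id].update(fields)

    def delete(self, item_id: int):                 # analogous to HTTP DELETE
        del self._items[item_id]

repo = ContentRepository()
cid = repo.create({"title": "Spring Sale", "body": "20% off"})
repo.update(cid, {"body": "25% off"})
print(repo.read(cid)["body"])  # 25% off
```

Note what is absent: no templates, no layout, no HTML. That absence is the defining trait of the headless model described above.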