Recap on storytelling.
We analyze the current landscape, starting from the Cluetrain Manifesto and moving through some definitions (social networks, networked publics).
How to create an effective message: personalization, groups, behaviours, communities, immediacy, perfect timing, and different techniques and styles.
Then some essential rules regarding listening and conversation, the blur between public and private, and goals.
The age of artificial intelligence, with deep dives on machine learning and deep learning. Machine perception and its applications. How companies use AI in their businesses. Case study: Netflix.
Overview of data collection methods and a deep dive on data (primary vs. secondary, qualitative and quantitative). Bias. Data processing and structured, unstructured, and semi-structured data. Database jargon.
Linked Data, with examples and why it matters. Data-driven strategies. Data mining: laws and applications. Data aggregation and fundamentals of data representation (table, bar chart, histogram, pie chart, line graph, scatter plot). Data science definition and job roles (who does what).
Visual communication of qualitative data (v. 2020 ITA), Frieda Brioschi
Qualitative data definition and examples. Qualitative metaphors. Data visualization & journalism. Common kinds: mind maps, flow diagrams, word clouds, user journeys, tube maps, maps. Qualitative chart chooser.
How to collect and organize data (v. ITA 2020), Frieda Brioschi
Overview of data collection methods and a deep dive on data (primary vs. secondary, qualitative and quantitative). Bias. Data processing and structured, unstructured, and semi-structured data. Example of personal data tracking.
This document outlines the process of data-informed decision making. It discusses acquiring relevant data from internal and external sources, analyzing the data to find patterns and relationships, applying personal expertise when interpreting the data, announcing and implementing decisions to stakeholders, and assessing the outcomes of decisions to continually improve the process. The goal is to formulate questions, leverage different types of data analysis, make evidence-based decisions, and monitor their impacts over time.
Online Data Preprocessing: A Case Study Approach, IJECEIAES
Besides Internet search and e-mail, social networking is now one of the three most popular uses of the Internet. A tremendous number of volunteers write articles and share photos, videos, and links every day, at a scope and scale never imagined before. However, because social network data are huge and come from heterogeneous sources, they are highly susceptible to inconsistency, redundancy, noise, and loss. For data scientists, preparing the data and getting it into a standard format is critical, because data quality directly affects the performance of the mining algorithms applied next. Low-quality data will certainly limit the analysis and lower the quality of the mining results. To this end, the goal of this study is to provide an overview of the different phases involved in data preprocessing, with a focus on social network data. As a case study, we show how we applied preprocessing to the data we collected on Malaysia Airlines Flight MH370, which disappeared in 2014.
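As a minimal illustration of the kind of cleaning such a preprocessing phase performs (the record fields and rules below are hypothetical, not taken from the study), incomplete and duplicate social-media records can be normalized like this:

```python
# Data-cleaning sketch: drop incomplete rows, collapse whitespace,
# and remove case-insensitive duplicates. The ("user", "text") record
# layout is an assumption for illustration only.

def preprocess(records):
    seen = set()
    cleaned = []
    for rec in records:
        user = (rec.get("user") or "").strip()
        text = " ".join((rec.get("text") or "").split())  # normalize whitespace
        if not user or not text:          # drop rows missing required fields
            continue
        key = (user.lower(), text.lower())
        if key in seen:                   # drop duplicates
            continue
        seen.add(key)
        cleaned.append({"user": user, "text": text})
    return cleaned

raw = [
    {"user": "alice", "text": "MH370  search   update"},
    {"user": "Alice", "text": "mh370 search update"},   # duplicate of the above
    {"user": "bob", "text": None},                      # incomplete row
]
print(preprocess(raw))  # one clean record survives
```

Real pipelines add further steps (tokenization, language filtering, spam removal), but the deduplicate-and-normalize pattern is the common core.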
This document provides an overview of data mining and data aggregation basics. It discusses key concepts such as the phases of the data mining process according to the CRISP-DM framework which includes business understanding, data understanding, data preparation, modeling, evaluation, and deployment. It also discusses different types of data aggregation including time and spatial aggregation and summarization techniques such as calculating the mean, count, maximum, minimum, mode, range, and sum. Additionally, it presents different ways of visualizing data including tables, bar charts, histograms, pie charts, and line graphs.
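The summarization techniques listed above can be sketched in a few lines with Python's standard library (the sample values are illustrative, not taken from the deck):

```python
# Compute the aggregates named above (mean, count, max, min, mode,
# range, sum) over a single numeric column.
import statistics

values = [4, 8, 8, 15, 16, 23, 42]  # example data

summary = {
    "mean":  statistics.mean(values),
    "count": len(values),
    "max":   max(values),
    "min":   min(values),
    "mode":  statistics.mode(values),
    "range": max(values) - min(values),
    "sum":   sum(values),
}
print(summary)
```

Time or spatial aggregation applies the same functions after grouping rows by a time bucket or region key.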
The Design of an Online Social Network Site for Emergency Management: A One-S..., guest636475b
Web 2.0 is creating new opportunities for communication and collaboration. Part of this explosion is the increase in popularity and use of Social Network Sites (SNSs) for general and domain-specific use. In the emergency domain there are a number of websites, wikis, SNSs, etc. but they stand as silos in the field, unable to allow for cross-site collaboration. In this paper we describe ongoing design science research to develop and refine guiding principles for developing an SNS that will bring together emergency domain professionals in a “one-stop-shop.” We surveyed emergency professionals who study crisis information systems, to ascertain potential functionalities of such an SNS. Preliminary results suggest that there is a need for the envisioned SNS. Future research will continue to explore possible solutions to issues addressed in this paper.
The Potential and Challenges of Today's AI, Bohyun Kim
A preconference presentation given by Bohyun Kim, CTO and Associate Professor, University of Rhode Island Libraries, at the NISO Plus Conference at Baltimore, MD on February 23, 2020.
Einstein published his ideas and became a pivotal figure in shifting the way we think about physics, from the Newtonian model to the quantum one; in turn this changed the way we think about the world and allowed us to develop new ways of engaging with it.
We are at a similar juncture. The development of computational technologies allows us to think about astronomical volumes of data and to make meaning of that data.
The mindshift is that "the machine is our friend". The computer, like all machines, extends our capabilities. As a consequence, the kinds of thinking now required in industry are those that move away from thinking like a computer and toward creative engagement with possibilities. Logical thinking is still necessary, but it starts to be driven by imagination.
Computational thinking and data science change the way we think about defining and solving problems.
The age of creativity increasingly extends its impact from the arts to business, scientific, technological, entrepreneurial, political, and other contexts.
IRE "Better Watchdog" workshop presentation "Data: Now I've got it, what do I..., J T "Tom" Johnson
The document discusses analyzing data that has been collected through investigative journalism projects. It provides tips on storing data in the cloud and bookmarking tools, challenges in analyzing poorly formatted government data from New Mexico's transparency portal, and strategies for analyzing both qualitative and quantitative data through tools like spreadsheets, databases, and data visualization programs. The goal is to turn collected data into useful information that can be shared through stories.
Research in Intelligent Systems and Data Science at the Knowledge Media Insti..., Enrico Motta
The document discusses research directions in intelligent systems and data science. It describes work on making sense of scholarly data through techniques like data mining, semantic technologies, and machine learning. It also discusses mapping and classifying computer science research areas using an automatically generated ontology with over 14,000 topics. Other topics discussed include predicting emerging research areas, applications in smart cities like the MK:Smart project, and potential roles for robots in smart cities like an autonomous health and safety inspector.
WEB EVOLUTION - THE SHIFT FROM INFORMATION PUBLISHING TO REASONING, ijaia
The Web, as a communication channel, has undergone a variety of developments that allow information to be published and accessed at scale. With the information revolution, several research studies have been conducted to improve on the present situation and to propose advanced versions of the Web. It is therefore important to examine the new versions of the Web in order to improve the way information is expressed, to make more intelligent choices, and to obtain better meaning from information on the Web. That is, the future Web will require a specific architecture to support the extraction of better meaning, or "reasoning". With Web 1.0 and Web 2.0, information on the Web is not understandable by machines; machine understanding is a big shift that opens the door to innovation and reasoning. In this work, we trace the progress of the Web from Web 1.0 and Web 2.0 through Web 3.0, Web 4.0, and Web 5.0, pointing out the document types and technologies employed in the changes from Web 1.0 to Web 3.0 and predicting the future of the Web (Web 4.0 and Web 5.0). We also present the current status of, and concerns about, the Web as an information source and communication channel.
Visual communication of quantitative data (v. 2020 ITA), Frieda Brioschi
Quantitative and qualitative data recap. Visual systems and preattentive attributes. Quantitative data visualization, chart selector. Some useful tactics.
The document discusses how social software and knowledge networks are changing how knowledge is produced, distributed, and validated. It introduces concepts like downloadable beliefs, where people can search online networks to find and adopt beliefs, and folksonomies, where common users collaboratively categorize information. The future of knowledge organization is envisioned to involve more social and semantic approaches like collaborative tagging and the semantic web, which aims to enable computers to understand the meaning and relationships of online information.
The document discusses several topics related to search engines and online information, including:
1) The PageRank algorithm and its extensions over time to provide more contextually relevant search results.
2) Concerns about privacy and the concentration of power as collective intelligence and user data become concentrated within large tech companies.
3) Differences in search results between engines and regions due to factors like censorship and localized information.
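As a rough sketch of the basic algorithm behind point 1 (a toy link graph with plain power iteration, not any search engine's actual implementation):

```python
# Toy PageRank by power iteration. The three-page graph and the
# standard damping factor of 0.85 are illustrative choices.

def pagerank(links, damping=0.85, iterations=50):
    nodes = list(links)
    n = len(nodes)
    rank = {node: 1.0 / n for node in nodes}
    for _ in range(iterations):
        new = {node: (1 - damping) / n for node in nodes}
        for node, outs in links.items():
            if outs:
                share = damping * rank[node] / len(outs)
                for target in outs:
                    new[target] += share
            else:  # dangling page: spread its rank evenly
                for target in nodes:
                    new[target] += damping * rank[node] / n
        rank = new
    return rank

links = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
ranks = pagerank(links)
print(ranks)  # ranks sum to 1; "c", with two in-links, scores highest
```

The extensions mentioned in the deck (personalization, freshness, context) amount to modifying the jump vector and adding extra ranking signals on top of this core.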
This article proposes a new e-portfolio system called PrPl that addresses key problems with current systems. PrPl leverages semantic indexing and cloud computing to allow users to manage their own data across applications in a sustainable way. It uses an "intelligent agent" or "butler" to track, verify, and intelligently present a user's digital artifacts stored across the cloud based on semantics. The authors argue this system could provide a customizable, responsive, and intelligent e-portfolio solution for lifelong learning. However, the article does not fully address issues of privacy, security, and cultural implications of relying on cloud infrastructure.
The document discusses the history and evolution of the Internet of Things (IoT) from its origins in the 1990s to its current applications and future potential. It explores key concepts in IoT like local sensing, data integration, and analytics of connected devices. Examples are provided of how IoT is being implemented in various sectors such as education, healthcare, retail, agriculture, and smart cities. The economic and social benefits of IoT adoption are also highlighted.
Responsive design, application development using APIs, and content strategy are hot topics in web development right now. These ideas belong to a bigger umbrella: ubiquitous computing and the role it plays in our lives. Traditional ideas of usability are undergoing dynamic changes as we move away from a desktop-first model of personal computing.
The internet refrigerator already exists and it's only the tip of the iceberg. In the near future, human-computer interactions will be thoroughly integrated into everyday objects and activities.
Postdesktop was a presentation to add clarity to responsive design as part of a larger context and to think about a shift that is changing the devices we use to access the web, the delivery method for education, the teaching and learning experience, and the whole of our lives.
Topics included a look at the role of pervasive computing:
• as it relates to responsive design
• in the classroom and textbooks
• in .edu marketing and utility on campuses
Written by Doug Gapinski and first delivered at PSU Web Conference 2012
FAIR data: Superior data visibility and reuse without warehousing, Alan Morrison
The advantages of semantic knowledge graphs over data warehousing when it comes to scaling quality, contextualized data for machine learning and advanced analytics purposes.
The age of artificial intelligence, with deep dives on machine learning and deep learning. Machine perception and its applications. How companies use AI in their businesses. Case study: Netflix. Basic tools for data manipulation and data visualization.
Knowledge Sharing over social networking systems, tanguy
1. The document discusses knowledge sharing over social networking systems and analyzes data from the social networking site Ecademy.
2. The Ecademy data showed a power law distribution structure typical of social networks and small world properties with short paths between users.
3. A survey of Ecademy users found that face-to-face relationships positively influenced relationship strength and knowledge sharing, though the site mainly facilitated weak relationships.
Linked Open Data and data-driven journalism, Pia Jøsendal
A keynote held at the Media 3.0 seminar in Bergen. It is an introductory presentation of the simple key elements of linked open data. It addresses media and journalists: what data-driven journalism can look like, and why they should care about what linked open data can offer.
Applied AI Workshop - Presentation - Connect Day GDL, Marc Teunis
An API, or Application Programming Interface, is a software intermediary that allows two applications to talk to each other. Essentially, an API provides a common language and set of rules for applications to communicate with each other.
Some key things to know about APIs:
- APIs allow different systems and applications to communicate with each other and share data without having to know the internal workings of each other. This makes it easier for applications to work together.
- APIs are used extensively on the internet, allowing websites, apps, and services to share data and functionality with each other. Common examples include mapping APIs, payment APIs, and social media APIs.
- APIs have an interface (a set of rules and specifications) that defines how applications can interact with them.
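To make the "common language and set of rules" concrete, here is a minimal sketch of an API contract in Python; every name (`WeatherAPI`, `get_temperature`, the sample readings) is hypothetical. The client uses only the published method and never touches the internal state:

```python
# Minimal API sketch: the public method is the "interface"; callers do
# not need to know how readings are stored internally.

class WeatherAPI:
    """Publishes one documented call: get_temperature(city) -> float (degrees C)."""

    def __init__(self):
        # Internal detail, hidden behind the interface: storage format
        # could change without breaking any caller.
        self._readings = {"oslo": 4.5, "cairo": 31.0}

    def get_temperature(self, city: str) -> float:
        key = city.strip().lower()
        if key not in self._readings:
            raise KeyError(f"no reading for {city!r}")
        return self._readings[key]

# A separate "application" talks to the API through its interface only.
api = WeatherAPI()
print(api.get_temperature("Oslo"))  # 4.5
```

Web APIs work the same way, except the contract is expressed as HTTP endpoints and payload formats rather than method signatures.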
This document provides an overview of an ICT (Information and Communications Technology) unit. By the end of the unit, students should be able to: 1) use ICT terms properly and discuss responsible cyber citizenship, 2) compare and contrast online platforms to achieve objectives, and 3) use search engines effectively to improve research skills. The unit covers topics like ICT definitions, online systems/ethics, and search/research skills through 4 lessons. It also provides background information on concepts like information, communication, technology, computers, the internet, and the world wide web. Ethics and safety are discussed, including plagiarism, cyberbullying, and maintaining privacy.
Humans in the loop: AI in open source and industry, Paco Nathan
Nike Tech Talk, Portland, 2017-08-10
https://niketechtalks-aug2017.splashthat.com/
O'Reilly Media gets to see the forefront of trends in artificial intelligence: what the leading teams are working on, which use cases are getting the most traction, previews of advances before they get announced on stage. Through conferences, publishing, and training programs, we've been assembling resources for anyone who wants to learn. An excellent recent example: Generative Adversarial Networks for Beginners, by Jon Bruner.
This talk covers current trends in AI, industry use cases, and recent highlights from the AI Conf series presented by O'Reilly and Intel, plus related materials from Safari learning platform, Strata Data, Data Show, and the upcoming JupyterCon.
Along with reporting, we're leveraging AI in Media. This talk dives into O'Reilly uses of deep learning -- combined with ontology, graph algorithms, probabilistic data structures, and even some evolutionary software -- to help editors and customers alike accomplish more of what they need to do.
In particular, we'll show two open source projects in Python from O'Reilly's AI team:
• pytextrank built atop spaCy, NetworkX, datasketch, providing graph algorithms for advanced NLP and text analytics
• nbtransom leveraging Project Jupyter for a human-in-the-loop design pattern approach to AI work: people and machines collaborating on content annotation
The document discusses principles for designing reusable learning objects and human-computer interaction. It describes learning objects as small instructional components that can be reused, describing programming languages like Scratch and Squeak that allow creating them. It also discusses universal design principles for education, ensuring representation, expression and engagement for all learners.
Confirming PagesLess managing. More teaching. Greater AlleneMcclendon878
Confirming Pages
Less managing. More teaching. Greater learning.
INSTRUCTORS GET:
• Interactive Applications – book-specific interactive
assignments that require students to APPLY what
they’ve learned.
• Simple assignment management, allowing you to
spend more time teaching.
• Auto-graded assignments, quizzes, and tests.
• Detailed Visual Reporting where student and
section results can be viewed and analyzed.
• Sophisticated online testing capability.
• A filtering and reporting function
that allows you to easily assign and
report on materials that are correlated
to accreditation standards, learning
outcomes, and Bloom’s taxonomy.
• An easy-to-use lecture capture tool.
Would you like your students to show up for class more prepared? (Let’s face it, class
is much more fun if everyone is engaged and prepared…)
Want ready-made application-level interactive assignments, student progress
reporting, and auto-assignment grading? (Less time grading means more time teaching…)
Want an instant view of student or class performance relative to learning
objectives? (No more wondering if students understand…)
Need to collect data and generate reports required for administration or
accreditation? (Say goodbye to manually tracking student learning outcomes…)
Want to record and post your lectures for students to view online?
INSTRUCTORS...
With McGraw-Hill's Connect® MIS,
haa7685X_fm_i-xxxv.indd ihaa7685X_fm_i-xxxv.indd i 12/20/11 9:29 PM12/20/11 9:29 PM
Confirming Pages
Want an online, searchable version of your textbook?
Wish you could reference your textbook online while you’re doing
your assignments?
Want to get more value from your textbook purchase?
Think learning MIS should be a bit more interesting?
Connect® Plus MIS eBook
If you choose to use Connect™ Plus MIS, you have an affordable and
searchable online version of your book integrated with your other
online tools.
Connect® Plus MIS eBook offers features like:
• Topic search
• Direct links from assignments
• Adjustable text size
• Jump to page number
• Print by section
Check out the STUDENT RESOURCES
section under the Connect® Library tab.
Here you’ll find a wealth of resources designed to help you
achieve your goals in the course. You’ll find things like quizzes,
PowerPoints, and Internet activities to help you study.
Every student has different needs, so explore the STUDENT
RESOURCES to find the materials best suited to you.
haa7685X_fm_i-xxxv.indd iihaa7685X_fm_i-xxxv.indd ii 12/20/11 9:29 PM12/20/11 9:29 PM
Confirming Pages
Management Information Systems
FOR THE INFORMATION AGE
NINTH EDITION
Stephen Haag
DANIELS COLLEGE OF BUSINESS
UNIVERSITY OF DENVER
Maeve Cummings
KELCE COLLEGE OF BUSINESS
PITTSBURG STATE UNIVERSITY
haa7685X_fm_i-xxxv.indd iiihaa7685X_fm_i-xxxv.indd iii 12/26/11 5:37 PM12/26/11 5:37 PM
Confirming Pages
MANAGEMENT INFORMATION SYSTEMS FOR THE INF ...
The document describes the Sixth Sense technology created by Pranav Mistry and students at MIT. It allows users to access the internet and interact with digital information by using hand gestures and physical surfaces as touchscreens. Key features include retrieving information about products by pointing a camera at them, getting flight status from boarding passes, and accessing encyclopedia entries from printed materials. The document outlines potential educational uses for student engagement, research, collaboration and cost-effectiveness. It also discusses advantages like portability, affordability, and connecting physical and digital worlds.
Deep Learning Explained: The future of Artificial Intelligence and Smart Netw...Melanie Swan
This talk provides an overview of an important emerging artificial intelligence technology, deep learning neural networks. Deep learning is a branch of computer science focused on machine learning algorithms that model and make predictions about data. A key distinction is that deep learning is not merely a software program, but a new class of information technology that is changing the concept of the modern technology project by replacing hard-coded software with a capacity to learn and execute tasks. In the future, deep learning smart networks might comprise a global computational infrastructure tackling real-time data science problems such as global health monitoring, energy storage and transmission, and financial risk assessment.
e-Research and the Demise of the Scholarly ArticleDavid De Roure
Innovations 2013 - e-Science, we-Science and the latest evolutions in e-publishing. STM International Association of Scientific, Technical & Medical Publishers. 4th December 2013, Congress Centre, Great Russell Street, London, UK.
Presentation slide used during the meetup on Artificial Intelligence and Its Ecosystem organized by Developer Session. In the presentation, I highlighted why open data is one of the key parts of AI ecosystem and the situation of Open Data in Nepal.
Computing is fundamental to all instructional technologies. VT should ensure students, faculty and staff are proficient in computational thinking and data-driven decision making. Ongoing research in areas like learning science and ubiquitous computing will lay the foundations for future educational practices. Digital libraries can transform learning by providing personalized educational resources and services through integrated virtual learning environments and educational metadata standards.
Robust Expert Finding in Web-Based Community Information SystemsRalf Klamma
Robust Expert Finding in Web-Based Community Information Systems
Ralf Klamma
Advanced Community Information Systems (ACIS)RWTH Aachen University, Germany
Similaire à Digital communication (v. 2021 ITA) (20)
Storytelling fundamentals (from Propp to Andrea Fontana) and examples. Marketing perspectives on storytelling. Storytelling with data techniques. Hints and examples
Visual communication of qualitative and quantitative data (v. 2021 ITA)Frieda Brioschi
Visual systems and preattentive attributes. Quantitative data visualization, chart selector. Some useful tactics. Qualitative data definition and examples. Qualitative metaphors. Data visualization & journalism. Common kinds: mind maps, flow diagrams, words cloud, user journey, tube map, maps. Qualitative chart chooser.
Survivorship bias applied to information. Cognition, how we learn, sensation and perception, experience. Human sight and visual perception, visual memory. Gestalt principles. Machine perception.
Introduction to data classification. Back to origins: history of libraries and their classification methods. Some examples of classification in different areas.
How to collect and organize data (v. ITA 2021)Frieda Brioschi
Overview on data collection methods and a deep dive on data (primary Vs secondary, qualitative and quantitative). Bias. Data processing and structured, unstructured, semistructured data. Example of personal data tracking.
Storytelling fundamentals (from Propp to Andrea Fontana) and examples. Marketing perspectives on storytelling. Storytelling with data techniques. Hints and examples
The document discusses visual perception and how information is processed. It covers topics like cognition, cognitive science, learning styles, sensation and perception, visual memory, gestalt principles, and limits of short-term memory. Examples are provided to illustrate concepts like preattentive attributes, chunking, and how visualization can take advantage of human perception to effectively communicate data and patterns.
This document provides an overview of data organization and classification in libraries and other domains. It begins by discussing the history of libraries and early classification systems used in ancient Mesopotamia and China. It then covers modern library classification standards like the Dewey Decimal System and subject-based organization. The document also examines classification of natural phenomena like volcanoes, stars, satellites, and languages. It concludes by discussing classification of administrative divisions and examples of categorizing a country's average life expectancy data.
Storytelling fundamentals (from Propp to Andrea Fontana) and examples. Marketing perspectives on storytelling. Storytelling with data techniques. Hints and examples
Quantitative and qualitative data recap. Visual systems and preattentive attributes. Quantitative data visualization, chart selector. Some useful tactics.
The document provides information about how humans perceive and process information. It discusses topics like cognition, cognitive science, learning styles, sensation and perception, visual perception, memory, Gestalt principles, and machine perception. For example, it explains that cognition encompasses many intellectual functions and processes like attention, memory, reasoning, and decision making. It also outlines several Gestalt principles of visual organization like similarity, proximity, and closure.
Introduction to data classification. Back to origins: history of libraries and their classification methods. Some examples of classification in different areas.
Communication epic fails, why crisis management matters.
We analyze the current landscape, starting from Cluetrain Manifesto, through some definitions (Social media vs industrial media, social networks, networked publics).
How we can create an effective message: personalization, groups, behaviours, communities, immediacy, perfect timing, different techniques and styles.
Then some essential rules, regarding listen and conversation, the blur between public and private, storytelling, goals and how
Introduction to AI for Nonprofits with Tapp NetworkTechSoup
Dive into the world of AI! Experts Jon Hill and Tareq Monaur will guide you through AI's role in enhancing nonprofit websites and basic marketing strategies, making it easy to understand and apply.
How to Add Chatter in the odoo 17 ERP ModuleCeline George
In Odoo, the chatter is like a chat tool that helps you work together on records. You can leave notes and track things, making it easier to talk with your team and partners. Inside chatter, all communication history, activity, and changes will be displayed.
How to Build a Module in Odoo 17 Using the Scaffold MethodCeline George
Odoo provides an option for creating a module by using a single line command. By using this command the user can make a whole structure of a module. It is very easy for a beginner to make a module. There is no need to make each file manually. This slide will show how to create a module using the scaffold method.
A review of the growth of the Israel Genealogy Research Association Database Collection for the last 12 months. Our collection is now passed the 3 million mark and still growing. See which archives have contributed the most. See the different types of records we have, and which years have had records added. You can also see what we have for the future.
it describes the bony anatomy including the femoral head , acetabulum, labrum . also discusses the capsule , ligaments . muscle that act on the hip joint and the range of motion are outlined. factors affecting hip joint stability and weight transmission through the joint are summarized.
Assessment and Planning in Educational technology.pptxKavitha Krishnan
In an education system, it is understood that assessment is only for the students, but on the other hand, the Assessment of teachers is also an important aspect of the education system that ensures teachers are providing high-quality instruction to students. The assessment process can be used to provide feedback and support for professional development, to inform decisions about teacher retention or promotion, or to evaluate teacher effectiveness for accountability purposes.
How to Manage Your Lost Opportunities in Odoo 17 CRMCeline George
Odoo 17 CRM allows us to track why we lose sales opportunities with "Lost Reasons." This helps analyze our sales process and identify areas for improvement. Here's how to configure lost reasons in Odoo 17 CRM
How to Manage Your Lost Opportunities in Odoo 17 CRM
Digital communication (v. 2021 ITA)
1. Frieda Brioschi - frieda.brioschi@gmail.com
Emma Tracanella - emma.tracanella@gmail.com
DIGITAL COMMUNICATION
LESSON 9 - 2021
2. LESSON 11
THE COURSE
1. Introduction. What are data and information, why they matter
2. How to collect and organize data
3. Information classification
4. Data lingo
5. Around Data Science
6. How we perceive information
7. Visual communication of quantitative data
8. Visual communication of qualitative data
9. Storytelling with data
10. Effective content
11. AI demythologized
12. Tools for analysis and data visualization
7. A POWERFUL GLOBAL CONVERSATION HAS BEGUN. THROUGH
THE INTERNET, PEOPLE ARE DISCOVERING AND INVENTING NEW
WAYS TO SHARE RELEVANT KNOWLEDGE WITH BLINDING SPEED.
AS A DIRECT RESULT, MARKETS ARE GETTING SMARTER—AND
GETTING SMARTER FASTER THAN MOST COMPANIES
Cluetrain Manifesto, 1999
LESSON 9
8. LESSON 9
SOCIAL MEDIA
Social media are online technologies and practices that people use to share text, images, video
and audio.
Andreas Kaplan and Michael Haenlein define social media as "a group of Internet-based
applications that build on the ideological and technological foundations of Web 2.0, and that
allow the creation and exchange of user-generated content.”
They represented a change in how people learn, read and share information and content: a
blend of sociology and technology transforms a monologue (one-to-many) into a dialogue
(many-to-many), information is democratized, and people turn from users into editors.
▸ https://web.archive.org/web/20111124233421/michaelhaenlein.eu/Publications/Kaplan,%20Andreas%20-%20Users%20of%20the%20world,
%20unite.pdf
9. LESSON 9
SOCIAL NETWORKS
Danah Boyd and Nicole Ellison define social network sites as web-based services that
allow individuals to:
▸ construct a public or semi-public profile within a bounded system,
▸ articulate a list of other users with whom they share a connection, and
▸ view and traverse their list of connections and those made by others within the
system.
The nature and nomenclature of these connections may vary from site to site.
▸ http://jcmc.indiana.edu/vol13/issue1/boyd.ellison.html
10. LESSON 9
NETWORKED PUBLICS
According to Danah Boyd, social network sites can be understood as networked
publics which are simultaneously:
▸ the space constructed through networked technologies and
▸ the imagined community that emerges as a result of the intersection of
people, technology, and practice
▸ http://www.danah.org/papers/TakenOutOfContext.pdf
11. LESSON 9
PROPERTIES & DYNAMICS
Four properties:
1. persistence
2. searchability
3. replicability
4. scalability
▸ http://www.danah.org/papers/TakenOutOfContext.pdf
Three dynamics:
1. invisible audiences
2. collapsed contexts
3. the blurring of public and private
23. LESSON 10
CONTENT STRATEGY
What do you want to communicate?
3 steps:
1. what
2. where
3. how
https://www.techsoup.org/support/articles-and-how-tos/5-steps-to-an-effective-content-strategy-for-your-nonprofit
24. GOOD CONTENT IS
WRITTEN FOR A
SPECIFIC AUDIENCE
https://medium.com/strategic-content-marketing/the-7-signs-of-truly-effective-content-270b50b8c6ae
25. IT’S OPTIMIZED FOR
SEARCH AND SOCIAL,
TOO.
https://medium.com/strategic-content-marketing/the-7-signs-of-truly-effective-content-270b50b8c6ae
29. TO BE MOST EFFECTIVE,
CONTENT MUST ALSO
BE STRUCTURED WELL.
https://medium.com/strategic-content-marketing/the-7-signs-of-truly-effective-content-270b50b8c6ae
30. GOOD CONTENT SHOULD
BE PROPERLY
DISTRIBUTED AND
PROMOTED.
https://medium.com/strategic-content-marketing/the-7-signs-of-truly-effective-content-270b50b8c6ae
31. LESSON 9
MOST COMMON CONTENT TYPES
1. Blog Posts
2. Long Form Articles
3. Original Research
4. Video
5. Infographics
6. Images
7. Case Studies
8. White Papers/Reports
9. Ebooks
10. Presentations
11. Webinars
12. Quizzes and Polls
13.Podcasts
14.Checklists
15.Email Newsletters
41. AI IS THE PART OF COMPUTER SCIENCE WHICH STUDIES THEORIES,
METHODOLOGIES AND TECHNIQUES THAT MAKE IT POSSIBLE TO DESIGN
HARDWARE AND SOFTWARE SYSTEMS THAT GIVE COMPUTERS SKILLS
WHICH, TO A COMMON OBSERVER, WOULD SEEM EXCLUSIVE TO HUMAN
INTELLIGENCE.
Marco Somalvico
42. LESSON 9
FOUR APPROACHES
The field of AI:
1. Thinking humanly: cognitive modeling. Systems should solve problems the
same way humans do.
2. Thinking rationally: the use of logic. Need to worry about modeling uncertainty
and dealing with complexity.
3. Acting humanly: the Turing Test approach.
4. Acting rationally: the study of rational agents: agents that maximize the
expected value of their performance measure given what they currently know.
https://cs.lmu.edu/~ray/notes/introai/
51. data & content design
CLASSIFYING AI
▸ Machine Learning
▸ Deep Learning
▸ Reinforcement Learning: the use of reward systems that achieve objectives in order
to strengthen (or weaken) specific outcomes. This is frequently used with agent systems.
▸ Agent Systems: systems in which autonomous agents interact within a given
environment in order to simulate emergent or crowd-based behavior. Used more and
more frequently in games in particular, but also in other forms of simulation.
▸ https://www.forbes.com/sites/cognitiveworld/2019/08/20/what-is-artificial-intelligence/#66ebdad9306f
52. data & content design
CLASSIFYING AI
▸ Non-Linear Grid Systems: a variation of agent systems in which cells in
n-dimensional grids maintain internal state but also receive stimuli from
adjacent cells and generate output to those cells. The distant ancestor of
most of these is Conway's Game of Life, but the idea is used to a much
higher degree of complexity in most weather and stock modeling systems,
which are fundamentally recursive.
▸ Self-Modifying Graph Systems: these include knowledge bases and so forth
in which the state of the system changes due to system contingent
heuristics.
53. data & content design
CLASSIFYING AI
▸ Knowledge Bases, Business Intelligence Systems and Expert Systems: these often form a
spectrum from traditional data systems to aggregate semantic knowledge graphs. To a
certain extent they are human curated, but some of this curation is increasingly switching
over to machine learning for classification, categorization and abstraction.
▸ Chatbots and Intelligent Agents: this differs from agent systems. Agents in general are
computer systems that are able to parse written or spoken text, use it to retrieve certain
content or perform certain actions, and then respond using appropriately constructed
content. The earliest such system, Eliza, dates back to the mid-1960s, but was very
primitive. Today's agents and chatbots, on the other hand, use a combination of
semantics, Bayesian analysis and machine learning to both build up the appropriate
information and learn about the user.
54. data & content design
CLASSIFYING AI
▸ Visual/Audio Recognition Systems: in most cases V/A systems work by converting the
media in question to an encoded compressed form, then algorithms look via either
indexes or machine learning systems for the closest matches. This is often enhanced
with Bayesian Analysis, where specific patterns are analysed based upon their
frequency of occurrence relative to one another, and are also often tied in with
semantic systems that provide relationship information.
▸ Fractal Visualization: the connection between fractals and AI runs deep, and not
surprisingly one of the biggest areas for AI is in the development of parameterized
natural rendering - the movement of water, the roar of fire, the coarseness of rock, the
effects of smoke in the air, all of which have become standard fare in big Hollywood
blockbusters.
55. data & content design
NOT PROPERLY AI
▸ Autonomous Vehicles: these make use of visual recognition systems and real time modeling in
order to both anticipate obstacles (static and moving) and to determine actions based upon
objectives.
▸ Drones: a drone is an autonomous vehicle without a passenger, and can be as small as a
dragonfly or as large as a jet. Drones can also act in a coordinated fashion, either by following
swarm behavior (an agent system) or by following preprogrammed instructions.
▸ Data Science / Data Analytics: this is the use of data to identify patterns or predict behavior.
This uses a combination of machine learning techniques and numeric statistical analysis, along
with an increasingly large role for non-linear differential equations. The primary distinction is
that most data scientists do not make heavy use of higher-order functions or recursion,
though again, this is changing.
56. data & content design
NOT PROPERLY AI
▸ Blockchain and Distributed Ledgers: distributed ledger technology underlies electronic coinage, but
it is also playing a bigger and bigger role in tracking resources and transactions. One aspect of such
systems is that they make it possible to bind virtual objects as if they were unique physical objects, in
effect making intellectual property exchangeable. This has application throughout the AI space,
especially in the realm of agent systems, even if it is not AI per se.
▸ Internet of Things / Robotics: internet of things is intended to provide network connectivity to devices
so that they can communicate with other devices. Robotics involves creating autonomous physical
agents capable of movement. Since both of these may end up managing their own state and rely upon
AI-based systems for identifying signals and determining responses, they use AI, but aren't directly AI.
▸ GPUs: the Central Processing Unit is so last century. Artificial intelligence is taking advantage of Graphics
Processing Units in a big way, as their structure makes them ideal for both semantic analysis and
recursive filter applications.
58. data & content design
ARTIFICIAL INTELLIGENCE
At a lower level, an AI can be just a programmed rule that makes the
machine behave in a certain way in certain situations.
So basically, Artificial Intelligence can be nothing more than a bunch of if-else
statements: an if-else statement is a simple rule explicitly programmed by a
human.
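The "bunch of if-else statements" idea can be made concrete. Below is a minimal sketch (the thresholds and actions are invented for illustration): a thermostat "AI" that is nothing more than explicitly programmed rules.

```python
# A minimal rule-based "AI": nothing but hand-written if-else rules.
# The thresholds and actions are invented for illustration.

def thermostat(temperature_c: float) -> str:
    """Decide an action from a single sensed value using fixed rules."""
    if temperature_c < 18.0:
        return "heat"
    elif temperature_c > 24.0:
        return "cool"
    else:
        return "idle"

print(thermostat(15.0))  # heat
print(thermostat(21.0))  # idle
print(thermostat(30.0))  # cool
```

No learning happens here: the "intelligence" is entirely the programmer's.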
59. data & content design
https://www.bbc.com/news/technology-50779761
60. data & content design
MACHINE LEARNING
Algorithms that analyze data, learn from it and make informed decisions based on
the learned insights.
Machine Learning algorithms must be trained on data. The more data you provide
to your algorithm, the better it gets.
The “training” part of a Machine Learning model means that this model tries to
optimize along a certain dimension. The Machine Learning models try to minimize the
error between their predictions and the actual ground truth values.
In short, machine learning models are optimization algorithms. If you tune them right,
they minimize their error by guessing and guessing and guessing again.
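The "guessing and guessing again" loop is exactly what training looks like. A minimal sketch, with toy data and a hand-picked learning rate: gradient descent fitting y = w · x by shrinking the mean squared error at every step.

```python
# Minimal "machine learning as optimization": fit y = w * x by gradient
# descent on mean squared error. Data and learning rate are toy choices.

data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]  # true relation: y = 2x

def mse(w):
    return sum((w * x - y) ** 2 for x, y in data) / len(data)

w = 0.0    # initial guess
lr = 0.05  # learning rate
for _ in range(100):
    # Gradient of the mean squared error with respect to w.
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    w -= lr * grad  # step in the direction that reduces the error

print(round(w, 3))    # 2.0
print(mse(w) < 1e-4)  # True: the error has shrunk
```

Each iteration is an informed "guess" that the error itself corrects.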
61. data & content design
DEEP LEARNING
Deep Learning uses a multi-layered structure of algorithms called a neural
network.
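The "multi-layered structure" can be sketched in a few lines of plain Python. The weights here are hand-picked for illustration (not learned): each layer computes weighted sums, applies a non-linearity, and feeds the next layer.

```python
# A toy two-layer neural network forward pass in plain Python.
# Weights are hand-picked for illustration, not learned.

def relu(v):
    return [max(0.0, x) for x in v]

def layer(inputs, weights, biases):
    """One dense layer: a weighted sum of the inputs plus a bias, per neuron."""
    return [sum(w * x for w, x in zip(ws, inputs)) + b
            for ws, b in zip(weights, biases)]

def forward(x):
    h = relu(layer(x, [[1.0, -1.0], [0.5, 0.5]], [0.0, 0.0]))  # hidden layer
    out = layer(h, [[1.0, 1.0]], [0.0])                        # output layer
    return out[0]

print(forward([3.0, 1.0]))  # hidden = [2.0, 2.0] -> output 4.0
```

Training would adjust the weight lists automatically; the layered computation itself is all a neural network is.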
62. data & content design
FEATURE EXTRACTION
Deep learning models require little to no manual effort to perform and
optimize the feature extraction process.
63. data & content design
EXAMPLE
To use a machine learning model to determine whether a particular image
shows a car or not, we humans first need to identify the unique features of
a car (shape, size, windows, wheels, etc.), extract these features and give them to
the algorithm as input data. This way, the machine learning algorithm performs
a classification of the image; that is, in machine learning, a programmer
must intervene directly in the classification process.
In the case of a deep learning model, the feature extraction step is completely
unnecessary: the model recognizes these unique characteristics of a car
and makes correct predictions, completely without the help of a human.
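The contrast described in this example can be sketched as two pipelines. Every function below is a hypothetical stand-in (the feature names and the "trained" model are invented); the point is only where feature extraction happens.

```python
# Contrast of the two pipelines from the example above. All functions are
# hypothetical stand-ins: the point is WHO extracts the features.

def manual_features(image):
    # Classic ML: a human decides which features matter and codes them.
    return {"has_wheels": True, "has_windows": True, "shape": "box"}

def ml_classifier(features):
    # Classic ML: the model only ever sees the hand-crafted features.
    return "car" if features["has_wheels"] and features["has_windows"] else "not car"

def deep_model(image):
    # Deep learning: the network receives the raw image and learns its own
    # internal features; no manual extraction step exists.
    return "car"  # stand-in for a trained network's prediction

raw_image = "raw pixel data"
print(ml_classifier(manual_features(raw_image)))  # car
print(deep_model(raw_image))                      # car
```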
64. data & content design
AND BIG DATA
Deep Learning models tend to increase their accuracy with the increasing
amount of training data, whereas traditional machine learning models stop
improving after a saturation point.
66. data & content design
MACHINE PERCEPTION
Machine perception is the capability of a computer system to interpret data in a manner that
is similar to the way humans use their senses to relate to the world around them.
The basic way computers take in and respond to their environment is through attached
hardware. Until recently, input was limited to a keyboard or a mouse, but advances
in technology, both in hardware and software, have allowed computers to take in sensory
input in a way similar to humans.
Machine perception allows the computer to use this sensory input, as well as conventional
computational means, to gather information with greater accuracy and to present it in a way
that is more comfortable for the user.
Machine perception includes computer vision, machine hearing, and machine touch.
67. data & content design
FACIAL RECOGNITION SYSTEM
A facial recognition system is a technology capable of identifying or verifying a
person from a digital image or a video frame from a video source.
There are multiple methods by which facial recognition systems work, but in general
they work by comparing selected facial features from a given image with faces within
a database.
Although the accuracy of facial recognition systems as a biometric technology is
lower than that of iris recognition and fingerprint recognition, they are widely adopted
due to their contactless and non-invasive process.
68. data & content design
THIS RECOGNITION PROBLEM IS MADE DIFFICULT BY THE GREAT VARIABILITY IN HEAD
ROTATION AND TILT, LIGHTING INTENSITY AND ANGLE, FACIAL EXPRESSION, AGING, ETC.
(…)
YET THE METHOD OF CORRELATION (OR PATTERN MATCHING) OF UNPROCESSED OPTICAL
DATA, WHICH IS OFTEN USED BY SOME RESEARCHERS, IS CERTAIN TO FAIL IN CASES
WHERE THE VARIABILITY IS GREAT. IN PARTICULAR, THE CORRELATION IS VERY LOW
BETWEEN TWO PICTURES OF THE SAME PERSON WITH TWO DIFFERENT HEAD ROTATIONS.
Woody Bledsoe, 1966
69. data & content design
DEEPFACE
DeepFace is a deep learning facial recognition system created by a research group
at Facebook.
It identifies human faces in digital images. It employs a nine-layer neural net with over
120 million connection weights, and was trained on four million images uploaded by
Facebook users.
The system is said to be 97% accurate, compared to 85% for the FBI's Next
Generation Identification system.
71. data & content design
APPLE FACE ID
Apple introduced Face ID on the flagship iPhone X as a biometric authentication system.
Face ID has a facial recognition sensor: "Romeo" the module that projects more than 30,000 infrared
dots onto the user's face, and "Juliet" the module that reads the pattern. The pattern is sent to a local
"Secure Enclave" in the device's CPU to confirm a match with the phone owner's face.
The system will not work with eyes closed, in an effort to prevent unauthorized access.
The technology learns from changes in a user's appearance, and therefore works with hats, scarves,
glasses (including many sunglasses), beards and makeup.
It also works in the dark, thanks to a "Flood Illuminator", a dedicated infrared
flash that throws invisible infrared light onto the user's face to properly read the 30,000 facial
points.
https://en.wikipedia.org/wiki/Facial_recognition_system
72. data & content design
CHINESE AIRPORTS
As of late 2017, China has deployed
facial recognition and artificial
intelligence technology in Xinjiang.
Reporters visiting the region found
surveillance cameras installed every
hundred meters or so in several cities, as
well as facial recognition checkpoints at
areas like gas stations, shopping centers,
and mosque entrances.
https://www.youtube.com/watch?v=wcM5-E4Kze4
73. data & content design
ANTI-FACIAL RECOGNITION SYSTEMS
74. data & content design
ENOVIA SMART ROBOTS
Smart Robots has developed a
universal device that enables the
integration of cobots with human
activities.
The device can map the workspace in real time, recognize objects, command
the robot to interact with users and adapt to them, and self-learn new
commands through gestures.
75. data & content design
277 PEOPLE IN 177 CARS
https://imgur.com/gallery/sCvRIEd
77. data & content design
AI EXAMPLES
▸ Smart assistants (like Siri and Alexa)
▸ Disease mapping and prediction tools
▸ Manufacturing and drone robots
▸ Optimized, personalized healthcare treatment recommendations
▸ Conversational bots for marketing and customer service
▸ Robo-advisors for stock trading
▸ Spam filters on email
▸ Social media monitoring tools for dangerous content or false news
▸ Song or TV show recommendations from Spotify and Netflix
https://builtin.com/artificial-intelligence
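One item from the list, spam filters, can be sketched as a tiny score-based rule. The keywords, weights and threshold below are invented; real filters learn such weights from millions of labelled messages.

```python
# Toy spam filter: score a message by weighted spam keywords.
# Keywords, weights and threshold are invented for illustration.

SPAM_WEIGHTS = {"free": 2.0, "winner": 3.0, "click": 1.5, "urgent": 1.5}
THRESHOLD = 3.0

def spam_score(message):
    """Sum the weights of known spam keywords found in the message."""
    return sum(SPAM_WEIGHTS.get(w, 0.0) for w in message.lower().split())

def is_spam(message):
    return spam_score(message) >= THRESHOLD

print(is_spam("You are a winner click here for a free prize"))  # True
print(is_spam("Meeting moved to Tuesday morning"))              # False
```

Replacing the hand-set weights with learned ones is what turns this rule into machine learning.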
78. data & content design
ALIBABA
Chinese company Alibaba runs the world's largest e-commerce platform, selling more than
Amazon and eBay combined.
AI is integral to Alibaba's daily operations and is used to predict what customers might want to
buy. With natural language processing, the company automatically generates product
descriptions for the site.
Another way Alibaba uses artificial intelligence is in its City Brain project to create smart cities.
The project uses AI algorithms to help reduce traffic jams by monitoring every vehicle in the
city.
Additionally, Alibaba, through its cloud computing division, Alibaba Cloud, is
helping farmers monitor crops to improve yields and cut costs with artificial intelligence.
79. data & content design
ALPHABET – GOOGLE
Waymo, the company's self-driving technology division, wants to bring self-driving
technology to the world, not only to move people around but also to reduce the number of
crashes. Its autonomous vehicles are currently shuttling riders around California in
self-driving taxis. Right now, the company can't charge a fare, and a human driver still sits
behind the wheel during the pilot programme.
Google acquired DeepMind: not only did its system learn how to play 49 different Atari
games, its AlphaGo programme was also the first to beat a professional player at the game of
Go.
Google Duplex - Using natural language processing, an AI voice interface can make
phone calls and schedule appointments on your behalf.
80. data & content design
AMAZON
Not only is Amazon in the artificial intelligence game with its digital voice assistant, Alexa, but
artificial intelligence is also part of many aspects of its business.
Another innovative way Amazon uses artificial intelligence is to ship things to you before you even
think about buying them. The company collects a lot of data about each person's buying habits,
and has enough confidence in that data not only to recommend items to its customers but also,
using predictive analytics, to predict what they need even before they need it.
Amazon Go. Unlike other stores, no checkout is required. The stores use artificial
intelligence technology that tracks which items you pick up and then automatically charges you for
them through the Amazon Go app on your phone. Since there is no checkout, you bring your
own bags to fill with items, while cameras watch your every move to identify each item you put in
your bag and ultimately charge you for it.
82. data & content design
WHAT IS NETFLIX?
Netflix's initial business model included DVD sales and rental by mail, but
Hastings abandoned the sales about a year after the company's founding to
focus on the DVD rental business. Netflix expanded its business in 2007
with the introduction of streaming media while retaining the DVD and Blu-ray
rental business. The company expanded internationally in 2010, with streaming
available in Canada, followed by Latin America and the Caribbean. Netflix
entered the content-production industry in 2012, debuting its first series,
Lilyhammer.
▸ https://en.wikipedia.org/wiki/Netflix
83. data & content design
A NEW COURSE
In 2006, Netflix launched an unusual, and highly successful, competition
designed to improve its recommendation system. It released a database of 100
million movie and TV show ratings from nearly 500,000 users and, in 2009,
awarded the $1 million jackpot to the first team to increase the accuracy of its
movie recommendation algorithm by more than 10 percent.
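The "10 percent" target was measured as a reduction in prediction error, typically root-mean-square error (RMSE) between predicted and actual ratings. A minimal sketch with toy numbers (not the competition's real data or models):

```python
import math

def rmse(predicted, actual):
    """Root-mean-square error between predicted and actual ratings."""
    return math.sqrt(sum((p - a) ** 2 for p, a in zip(predicted, actual)) / len(actual))

actual = [4, 3, 5, 2, 4]                 # true user ratings (toy data)
baseline = [3.5, 3.0, 4.0, 3.0, 3.5]     # predictions from an incumbent model
challenger = [3.9, 3.1, 4.8, 2.3, 3.9]   # predictions from a competing model

base_err = rmse(baseline, actual)
chal_err = rmse(challenger, actual)
improvement = (base_err - chal_err) / base_err * 100
print(f"baseline RMSE={base_err:.3f}, challenger RMSE={chal_err:.3f}, "
      f"improvement={improvement:.1f}%")
```

A lower RMSE means the model's predicted ratings sit closer to what users actually gave; the prize went to the first team whose relative RMSE improvement over Netflix's own algorithm exceeded 10 percent.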
The rapid rise of streaming content has dramatically expanded the amount and variety of
data available to the company’s data science team.
▸ https://www.technologyreview.com/s/428867/why-there-wont-be-a-netflix-prize-sequel/
84. data & content design
HOUSE OF CARDS
Netflix based its original-programming bet on House of Cards on what it knew
about the viewing habits of its users: which and how many users watched movies
starring Kevin Spacey or directed by David Fincher, and, through its tagging and
recommendation system, how many sat through other similar political dramas. It
has also shown different trailers to people depending on their particular viewing
habits.
▸ https://www.technologyreview.com/s/511771/house-of-cards-and-our-future-of-algorithmic-programming/
85. data & content design
EVENTS TRACKED
▸ When you pause, rewind, or fast forward
▸ What day you watch content (Netflix has found people watch TV shows
during the week and movies during the weekend.)
▸ The date you watch
▸ What time you watch content
▸ Where you watch (zip code)
86. data & content design
EVENTS TRACKED
▸ What device you use to watch (Do you like to use your tablet for TV shows and
your Roku for movies? Do people access the Just for Kids feature more on their
iPads, etc.?)
▸ When you pause and leave content (and if you ever come back)
▸ The ratings given (about 4 million per day)
▸ Searches (about 3 million per day)
▸ Browsing and scrolling behavior
▸ https://neilpatel.com/blog/how-netflix-uses-analytics/
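Tracked events like these can be modeled as simple structured records and then aggregated. A minimal sketch (hypothetical field names and toy data, not Netflix's actual schema), reproducing the kind of analysis behind the "shows on weekdays, movies on weekends" finding:

```python
from collections import Counter
from dataclasses import dataclass
from datetime import datetime

@dataclass
class PlaybackEvent:
    """One tracked interaction, loosely modeled on the events listed above."""
    user_id: str
    title: str
    kind: str          # "show" or "movie"
    action: str        # "play", "pause", "rewind", ...
    timestamp: datetime
    device: str
    zip_code: str

events = [
    PlaybackEvent("u1", "Dark", "show", "play", datetime(2020, 3, 3, 21, 0), "tablet", "20121"),
    PlaybackEvent("u1", "Roma", "movie", "play", datetime(2020, 3, 7, 22, 15), "tv", "20121"),
    PlaybackEvent("u2", "Dark", "show", "play", datetime(2020, 3, 4, 20, 30), "phone", "10115"),
]

# Aggregate: which kind of content is played on weekdays vs weekends?
buckets = Counter()
for e in events:
    day = "weekend" if e.timestamp.weekday() >= 5 else "weekday"
    buckets[(day, e.kind)] += 1

print(buckets)
```

At Netflix's scale the same aggregation runs over millions of events per day, but each event still carries who, what, when, where, and on which device.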
87. data & content design
5 USE CASES OF AI/DATA/MACHINE LEARNING AT NETFLIX
▸ Personalization of Movie Recommendations
▸ Auto-Generation and Personalization of Thumbnails / Artwork
▸ Location Scouting for Movie Production (Pre-Production)
▸ Movie Editing (Post-Production)
▸ Streaming Quality
▸ https://becominghuman.ai/how-netflix-uses-ai-and-machine-learning-a087614630fe
88. data & content design
PERSONALIZED IMAGE THUMBNAIL / ARTWORK
▸ Problem: How (and when) do we best present that movie recommendation to the user in a way that
maximizes viewership and monthly subscriber loyalty?
▸ Users spent an average of 1.8 seconds considering each title they were presented with while on Netflix
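With only about 1.8 seconds of attention per title, choosing which artwork to show can be framed as a multi-armed bandit problem: try each variant, then favor whichever users click most. A minimal epsilon-greedy sketch (illustrative only, with made-up variant names and click rates, not Netflix's actual system):

```python
import random

class ThumbnailBandit:
    """Epsilon-greedy choice among artwork variants for one title."""

    def __init__(self, variants, epsilon=0.1):
        self.epsilon = epsilon
        self.shows = {v: 0 for v in variants}   # times each variant was shown
        self.clicks = {v: 0 for v in variants}  # times showing it led to a play

    def ctr(self, v):
        # Observed click-through rate (0.0 before the variant is ever shown)
        return self.clicks[v] / self.shows[v] if self.shows[v] else 0.0

    def choose(self):
        # Explore a random variant occasionally; otherwise exploit the best CTR
        if random.random() < self.epsilon:
            return random.choice(list(self.shows))
        return max(self.shows, key=self.ctr)

    def record(self, variant, clicked):
        self.shows[variant] += 1
        if clicked:
            self.clicks[variant] += 1

random.seed(0)
bandit = ThumbnailBandit(["close-up", "landscape", "ensemble"])
# Simulated users: assume "close-up" truly converts best (made-up numbers)
true_ctr = {"close-up": 0.20, "landscape": 0.05, "ensemble": 0.08}
for _ in range(5000):
    v = bandit.choose()
    bandit.record(v, random.random() < true_ctr[v])

print({v: bandit.shows[v] for v in bandit.shows})
```

After enough impressions, the variant with the highest observed click-through rate is shown almost all the time, which is exactly the "maximize viewership" objective stated above; a production system would additionally condition the choice on the viewer's profile.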