Echoes: A Collaborative Tagging System for Conversations in the Enterprise

WHAT ARE THE BENEFITS OF COLLABORATIVE TAGGING OF CONVERSATIONS?

• Identify significant conversations: Our system will identify important conversations that may be useful to the enterprise.

• Identify expert communities: If one thinks of tags as “topics”, the analysis of people and tags gives an idea of expert communities, i.e. the networks of people that grow around a tag.

• Enable communication analysis: We can validate imposed groups and hierarchies (e.g. Sales, Human Resources) against real-world conversational practices. One can study how well these groups are functioning, what their communication patterns are, and whether existing communication flows meet expectations.

• Enable rich media search and retrieval: We can search and retrieve audio and video conversational artifacts using the tags. Tags can be used to point to portions of audio and video and can be used in conjunction with ASR and word-spotting techniques for enhanced search and retrieval of rich media.



Table of Contents

Abstract
Introduction
Background
Challenges
Echoes
Results
Conclusions and Future Work

Abstract

Social media websites like Flickr and del.icio.us enable collaboration by allowing users to easily share content on the web through tagging. To provide a similar advantage to the enterprise, we have designed a tagging system called Echoes for audio conversations. We are developing telephonic interfaces, where participants of a spoken conversation can opt to archive and share it. We have also developed a web-based visual interface that enables annotation, search, and retrieval of archived conversations. In this interface, we have focused on visualizing relationships as a social network of users, tags and conversations, which will enable efficient searching and browsing, and more importantly, provide contextualized interaction histories. A first pilot user study, conducted using simulated data, showed that our social network visualization was effective in identifying patterns of communication. Analysis of tags contributed for actual enterprise meetings in subsequent studies showed that the tags can help in audio search and retrieval and perform functions such as describing and summarizing the audio.






Introduction
This white paper focuses on the problem of enhancing conversations in the enterprise through social media. We have
designed a system that enables persistent (audio) conversations in the enterprise, primarily by the use of the Web 2.0
technique of tagging. Tagging is widely used as it generates folksonomies that describe, classify and distinguish the
tagged content.

Persistent conversations can be stored, accessed and retrieved, which opens up new possibilities for action: for example, an individual can enhance future conversations by reflecting on past interactions, and can share past conversations with other users. Persistence enables us to search, browse, replay, annotate, visualize, restructure and recontextualize conversations [2]. In particular, we have designed a visualization that allows users to search, browse, find, tag and listen to (stored) conversations. Additionally, the visualization allows discovery of relevant communities as well as the relationships and topics of discussion within them.



Background
The advent of social media under the umbrella of Web 2.0 and its enterprise counterpart Enterprise 2.0 [8] is breaking down perceived barriers to collaboration and the sharing of content (e.g. the sharing of personal photographs and video on Flickr and YouTube). With a view towards extending the kind of rich user experiences provided by social media on the web to the enterprise, we have designed and implemented Echoes, a system for users to store conversations, tag them and share them with other users. We use the Web 2.0 technology of “collaborative tagging” to do this because of the sheer number of ways in which tags can be used, e.g., as descriptors, opinions, reminders for action and even for mutual awareness [3, 7].




In an enterprise, conversations are the most important resource for collaborative work. Through conversations, knowledge is created and shared [9]. Team members rely on a variety of channels to converse: face-to-face, email, IM, voice, etc. While some of these channels, such as email, routinely provide for storage, retrieval, and sharing (through copying, forwarding, etc.), others, such as voice, generally do not support these practices. Some efforts in this direction have been made in other projects [10, 11].

In Echoes, participants in an (audio) conversation can choose to archive it. Groups of users can be granted access to the conversation by the participants, e.g., a meeting between a subset of people in a department may be deemed relevant to the rest of the department. A conversation deemed “public” can be listened to and tagged by others in the enterprise. Figure 1 depicts the functionalities that Echoes offers.








Figure 1: Functional overview of archiving and tagging conversations (labels in the figure: Private conversations, Public conversations, Archive & Tag, Listen & Tag).
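To make the access model just described concrete, the following is a minimal sketch (in Python) of how such a visibility check might be expressed. The argument names and structure are illustrative assumptions for this sketch, not the actual Echoes implementation.

```python
# Illustrative sketch only: the argument structure is an assumption, not the Echoes schema.
def can_listen_and_tag(user: str, participants: set[str],
                       shared_with: set[str], public: bool) -> bool:
    """A user may listen to and tag a conversation if it is public, if she
    participated in it, or if access was explicitly granted to her."""
    return public or user in participants or user in shared_with

# Example: a department meeting shared with two additional colleagues.
print(can_listen_and_tag("Mary", {"Ajita", "Shreeharsh"}, {"Mary", "Doree"}, False))  # True
```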



Challenges
In the long term, Echoes must be integrated into conversational practices in the enterprise. These span several aspects, including, but not limited to, the actual role that conversations play in the accomplishment of work in the enterprise, the creation of a collective organizational memory [1] that a knowledge worker in the enterprise can draw upon, and the issue of privacy. We concede that recording audio conversations presents a thorny privacy issue, despite the potential advantages of doing so, such as catching up on missed meetings and sharing relevant snippets of meetings with people. It is our belief that as users in enterprises begin to see the benefits of persistent audio conversations, any adverse reaction to recording conversations will be mitigated, provided, of course, that robust privacy safeguards are implemented. A full deployment of Echoes will include authentication and authorization controls, and participants will control access to recorded conversations using privacy settings.

We have identified the following immediate challenges that the interface must address (a minimal data-model sketch illustrating these requirements follows the list).

1. Users must be able to query for conversations. Querying could be on the basis of people (i.e. users) or topics (i.e. tags, descriptions, etc.).

2. Users must be able to tag a conversation, a group of conversations, or part of a conversation.

3. Users must be able to listen to a conversation as well as access all relevant information about it, such as the names of the participants, tags, and taggers.

4. Users must be able to use the interface to deduce histories and interactions between different participants, and their relationships with different topics and with each other.
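The sketch below illustrates, under stated assumptions, one possible data model that captures these requirements: conversations carry participants and tag events (possibly scoped to part of the audio), and simple keyword matching covers queries by people or topics. The names and structure are hypothetical, not the Echoes schema.

```python
from dataclasses import dataclass, field
from typing import Optional

# Hypothetical data model for the four requirements above; names are assumptions.
@dataclass
class TagEvent:
    tag: str
    tagger: str
    start_sec: Optional[float] = None   # a tag may mark only part of a conversation
    end_sec: Optional[float] = None

@dataclass
class Conversation:
    conversation_id: str
    participants: list[str]
    tag_events: list[TagEvent] = field(default_factory=list)

    def matches(self, keyword: str) -> bool:
        """Requirement 1: query by people (participants) or topics (tags)."""
        kw = keyword.lower()
        return any(kw in p.lower() for p in self.participants) or any(
            kw in t.tag.lower() for t in self.tag_events
        )

    def taggers(self) -> set[str]:
        """Requirement 3: expose participants, tags, and taggers."""
        return {t.tagger for t in self.tag_events}
```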








We have attempted to address the above challenges in our current visualization. We present Echoes in detail in the
next section.



Echoes
We have implemented the conferencing part of our preliminary system on the Avaya Conferencing Bridge (Meeting Exchange). A registered user is assigned a host/participant code and can join a conference call using a single access number. To record a call, the host presses *2 and then inputs a 4-digit file number that will be used to name the audio recording. The audio recording, along with the names of the participants, is then imported into our database.
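As a rough illustration of this import step, the sketch below stores the recording's path and its participants in a small SQLite database. The file naming, directory, and schema are assumptions made for the sketch, not details of the actual system.

```python
import sqlite3
from pathlib import Path

# Minimal sketch of importing a recorded call; paths and schema are illustrative only.
def import_recording(db_path: str, file_number: str, participants: list[str],
                     recordings_dir: str = "/var/recordings") -> None:
    audio_path = Path(recordings_dir) / f"{file_number}.wav"   # assumed naming convention
    conn = sqlite3.connect(db_path)
    with conn:
        conn.execute(
            "CREATE TABLE IF NOT EXISTS conversations "
            "(file_number TEXT PRIMARY KEY, audio_path TEXT)"
        )
        conn.execute(
            "CREATE TABLE IF NOT EXISTS participants (file_number TEXT, name TEXT)"
        )
        conn.execute(
            "INSERT OR REPLACE INTO conversations VALUES (?, ?)",
            (file_number, str(audio_path)),
        )
        conn.executemany(
            "INSERT INTO participants VALUES (?, ?)",
            [(file_number, name) for name in participants],
        )
    conn.close()
```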

We have designed and developed a web-based visual interface for tagging and retrieving conversations that are conducted on the conference bridge. The overall visual interface is shown in Figure 2 (see Figure 8 at the end of the paper for a larger-sized image). It consists of three sub-views: (a) the Social Network View, (b) the Details View and (c) the Conversation List View. The Social Network View allows a user to figure out interaction histories. The Details View reveals details about a particular conversation, tag or user. The List View enables a user to view lists of conversations.


Figure 2: Visual Interface and its three component views (the Social Network View, the Details View, and the List View).

The rationale for these three views is as follows. People are familiar with ‘Lists’ in other persistent conversation
channels such as emails and IM. Lists can be scanned quickly and can be sorted. The lower-right List View
provides this functionality.

The upper-right Details View functions like the “preview” panel in email clients like Outlook. It provides the details
of a selected conversation (in terms of participants, tags and taggers). The selected conversation can be tagged
through the text field provided. The Details View also provides details for a selected Tag or User.

We attempted to do something slightly different in the Social Network View. The essential content of this view is the same as that of the List View: a set of conversations. However, the representation is different: here, we attempt to show the relationships between various conversations, tags and users. We have suggested a computational model for tagged conversations elsewhere [5], which is reflected in the Social Network View.








The Social Network View is shown in Figure 3 (see Figure 8 for a detailed view). On the top left, a user can submit
keywords such as the names of users or tags. A set of conversations matching the keywords is retrieved and displayed.




Figure 3: Features of the Social Network View (the search box, conversation particles, and query particles).

The conversations are represented as particles freely moving within the space, with variations in color tint indicating their “age”: recent conversations appear bolder while older conversations appear faded. Details of a
conversation including the audio recording of the conversation can be easily obtained by clicking on a particle.
The Details window (upper-right) refreshes when a particle is clicked, displaying the conversation details as well as
playing the audio file associated with that conversation.

Along with the conversations, the Social Network View displays users and tags at the bottom: these users and tags
are associated with the conversations returned by Echoes in response to the query. When one of these users or tags
is selected, a “query particle” is introduced into the search space (see Figure 4). Conversations that are associated
with this query get attracted to it. A conversation is associated with a user if (a) she has participated in it, or (b)
she has tagged it. A conversation is associated with a tag if the tag has been applied to the conversation. Lines are drawn between queries and their associated conversations. To indicate the different associations (i.e. participation and tagging), the lines are rendered in different colors.
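These association rules can be expressed compactly. The sketch below is one possible rendering under assumed data structures (a plain dictionary per conversation) and is not the actual Echoes code; the returned label could then drive the line color.

```python
from typing import Optional

# Sketch of the association rules described above; the dictionary keys are assumptions.
def association(conversation: dict, query: str) -> Optional[str]:
    """Return the kind of association between a conversation and a query particle,
    which determines whether a line is drawn and how it is colored."""
    if query in conversation.get("participants", []):
        return "participation"      # user query: she took part in the conversation
    if query in conversation.get("taggers", []):
        return "tagging"            # user query: she tagged the conversation
    if query in conversation.get("tags", []):
        return "tag-applied"        # tag query: the tag was applied to the conversation
    return None                     # not associated: no line is drawn

# Example:
conv = {"participants": ["Ajita", "Shreeharsh"], "taggers": ["Lynne"], "tags": ["social networks"]}
print(association(conv, "Lynne"))   # -> "tagging"
```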








Figure 4: Steps taken to refine a query (the query “Ajita” is introduced, then the query “Shreeharsh”; conversations related to both queries cluster between them).

When multiple queries are introduced, the conversation particles re-arrange themselves so that they are closest to the queries they are related to. Conversations are color-coded depending on how many queries they are related to (in Figure 4, blue, green, and orange particles are associated with one, two, and three queries, respectively). Along with the rearrangement of the conversations in the Social Network View, the List View also updates itself when queries are introduced: the conversations in the List View are sorted and color-coded depending on the number of queries they are associated with.
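As an illustration of this behavior, the sketch below sorts and color-codes a list of conversations by the number of query particles they are associated with. The data structures and the exact color mapping are assumptions taken from the Figure 4 example, not the actual implementation.

```python
# Sketch of sorting and color-coding the List View by query count; keys and colors
# are assumptions drawn from the Figure 4 example.
QUERY_COUNT_COLORS = {1: "blue", 2: "green", 3: "orange"}

def query_count(conv: dict, queries: list[str]) -> int:
    related = (set(conv.get("participants", []))
               | set(conv.get("taggers", []))
               | set(conv.get("tags", [])))
    return sum(1 for q in queries if q in related)

def sort_and_color(conversations: list[dict], queries: list[str]) -> list[tuple[dict, str]]:
    ranked = sorted(conversations, key=lambda c: query_count(c, queries), reverse=True)
    return [(c, QUERY_COUNT_COLORS.get(query_count(c, queries), "gray")) for c in ranked]
```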

We now consider some scenarios to illustrate how users may utilize the visual interface. While these scenarios are not exhaustive with regard to the types of things that users may use the interface for, they will help put our visualization into perspective.








Finding the People to Consult on a Particular Topic
Adithya, a new employee, wants to find the identities of people working on social networks research in the
enterprise. He types “social networks” and about 100 conversations appear in the search space. From the row
of users at the bottom, he sees that Ajita is associated with more conversations than any other user. When he
clicks on Ajita (see Figure 5), the user “Ajita” is introduced into the Social Network view and a large number of
conversations cluster around her.

To find out more, he adds two more people to the search space: Shreeharsh and Reinhard (two names he
recognizes from his limited time working in the department). He now sees that most of the conversations that
Shreeharsh has had on “Social Networks” are with Ajita and furthermore, Ajita and Reinhard seem to have a
number of conversations between them. He adds user “Doree” into the search space and sees that Ajita has
had conversations with Doree on Social Networks too. Adithya infers that Ajita is probably an authority on Social
Networks. He listens to some of the conversations that Ajita has participated in and understands that other
colleagues have consulted her for advice. He saves Ajita’s contact (right-clicking on Ajita pulls up her contact
information) for future use as he might need to discuss his research plans on Social Networks with her. He also
notes that the best way to contact Ajita would be through Reinhard or Shreeharsh, both of whom he knows. Note that the visualization can be enhanced with communication capabilities that allow Adithya to send Ajita an email or IM, or to call her through a softphone.




Figure 5: Social Network View indicating that the user Ajita may be an authority on the topic “Social Networks” as three other users converse with her regularly on that topic.

In another scenario, Mary is preparing for an important presentation to her new client, Company X. She wants to
make her presentation more effective by aligning her points with her client’s core business interests. She decides to
search for a community that is interested in Company X.








Figure 6: Details View and Social Network View depicting top users who are associated with the tag “Company X”.

She types “Company X” in the main search text field; she sees that the tag has been used before by quite a few people.
The Details View shows that the tag “Company X” has been used most frequently by two people. She introduces them
as well as the tag “Company X” as queries in her search space and finds they have archived discussions related to their
work with this client. Mary wonders whether they could give her some valuable feedback on her presentation and insights that might help while working with Company X. She requests a short meeting with them to discuss her work and presentation (see Figure 6).

Discovering Hidden Connections in the Community
In another scenario, a user John, who has archived several conversations, types his own name in the keyword
search to retrieve them. A row of associated people appears at the bottom of the search space. John sees the
names of a few of his colleagues that he regularly meets with. Additionally, John notices a user, Lynne, whom
he does not recognize. Curious to see what his relationship with Lynne is, he introduces two users into the search
space, John (i.e. himself) and Lynne. A quick glance at the visualization tells him this: Lynne has been actively
tagging several of his conversations (see Figure 7).








Figure 7: Social Network View showing Lynne has tagged many of John’s conversations, indicating that they may have common interests.

John finds out (from the tags) that these conversations are mainly regarding a technical problem John’s group has
been grappling with. Perhaps Lynne’s development team has been having similar problems and could provide John
with valuable input. John sends Lynne a short email and her reply confirms John’s suspicion. They share their documentation and links and decide to arrange a joint meeting between their two groups to tackle the problem.



Results
When the first prototype of Echoes was completed, we conducted a small pilot study with 4 users to test
the effectiveness of our visual interface. We simulated conversation data by constructing patterns of inter-
communication and tagging. The questions in the study were about the relationships that our subjects were able
to observe between users, tags and conversations. We were more interested in the methods used to arrive at the answers than in the accuracy of the answers (e.g. how they queried for conversations, what sub-queries they used, etc.). For this reason we asked our subjects to provide an account of how they went about answering every question. To supplement this account, we also recorded their interactions with Echoes using a screen-capture program. We asked our subjects four sets of questions with 2 to 4 questions in each set. Questions ranged from simple ones like “How many conversations did Jane participate in on XX/XX/XX?” to slightly more difficult ones like “Can you estimate how many conversations Jack participated in between XX/XX/XX and XX/XX/XX?”.

The response we obtained from the study was extremely encouraging. The participants began to develop strategies that combined sub-queries and search keywords to obtain the appropriate conversations. What
became clear was that many of our questions would have been difficult to answer without the Social Network View
and our subjects tended to interact with this view the most. The “counting” questions (see above) were the hardest
since some of them did not have obvious answers; in this case, our subjects tended to ask questions about some








facet of the interface (what does this query do? what does this line mean?), thus revealing their pre-conceptions
about some aspect of the interface. This revealed to us which aspects of our design worked well and which did not.

More importantly, our subjects gave us valuable suggestions. In our interface, people are associated with conversations depending on whether they are taggers or participants; however, the study showed that a clearer distinction between a person who simply tagged a conversation and someone who participated in it was preferred. The study also showed that there is a need to emphasize who applied a tag rather than the tag alone, since the application of the tag indicates something of interest about the tagger. This feedback and the screen captures will help guide future work.

We then selected five teams from our organization, and with their permission, started recording their scheduled audio
conversations. All these five teams were distributed, with members from all over the US (and in one case, with members
from China). Our data collection is essentially an ongoing process and we hope to add more teams and continue to add
audio conversations to our repository, even as we add more features to Echoes. To further enrich the content, we have
added 15 podcasts about topics related to the meetings to the Echoes system.

We conducted a second user study with a team of 5 members for a period of five days. We asked them to participate in a
game of “tag”. The purpose of the game was to get them to tag the audio conversations and comment on tags that others
had used. Each member’s tags were visible to others in the same group. We asked the users to tag their own meetings as well as some of the podcasts (a total of 20 meetings); 252 tags were contributed during the study.

We performed a detailed content analysis of the collected tags. We first looked at whether the tags could have been obtained automatically, using ASR techniques such as word-spotting. Word-spotting requires a vocabulary of standard terms. Less than 10% of our tags were standard (as checked against Wikipedia). Most of them, on the contrary, were “hybrid”. By hybrid tags, we mean tags like “Microsoft bids for yahoo” and “discussionabout-proposed-slideset”: phrases that do not occur in their literal form in the audio. It is evident that tags contributed by a community have great value, as they capture concepts within the audio that could not have been easily extracted through programmatic means.
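As a rough illustration of this check, the sketch below classifies a tag as “standard” only if all of its words appear in a reference vocabulary. The tiny vocabulary here is a stand-in for the Wikipedia-derived check used in the study, so the details are assumptions.

```python
# Rough sketch of the standard-vs-hybrid check: a tag is "standard" only if all of
# its words occur in a reference vocabulary. The vocabulary below is illustrative.
def is_standard(tag: str, vocabulary: set[str]) -> bool:
    words = tag.replace("-", " ").lower().split()
    return bool(words) and all(word in vocabulary for word in words)

vocab = {"microsoft", "yahoo", "asr", "podcast"}
print(is_standard("ASR", vocab))                        # True  -> word-spottable
print(is_standard("Microsoft bids for yahoo", vocab))   # False -> hybrid tag
```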

Next, we classified the tags that users contributed in the study. Recall that no restrictions were imposed on the types of tags that users could come up with, and no recommendations were given. The tag classification is built around two aspects:

1. The affordances the tags give to the audio: We looked at the affordances that the tags confer on the audio. We found affordances such as description, summary, comments, markers (i.e. marking a specific point or points on the audio timeline), and turn-taking. We refer to this aspect as the “function” of a tag. These affordances reveal that tags can be used for fine-grained searching within audio and can answer questions such as “what were these meetings about?” or “what did user X say about topic Y in that presentation?”.

2. The entities that the tags refer to: The tags invariably seem to refer to different entities (such as people, groups, technologies, activities, events, organizations, etc.) which are related to the conversation and which may or may not be mentioned in the conversation itself. By connecting the conversation to this wide network of related entities, the tags serve to index the different conversations and relate them to each other. They also make the conversation more accessible, since a number of keywords now lead to specific conversations. (A small sketch of this two-aspect classification follows below.)
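The sketch below records this two-aspect classification as a small data structure; the category values simply mirror the examples above and are illustrative, not an official taxonomy.

```python
from dataclasses import dataclass
from enum import Enum

# Sketch of the two-aspect tag classification; categories mirror the examples in the text.
class Function(Enum):
    DESCRIPTION = "description"
    SUMMARY = "summary"
    COMMENT = "comment"
    MARKER = "marker"          # marks a point or points on the audio timeline
    TURN_TAKING = "turn-taking"

class Entity(Enum):
    PERSON = "person"
    GROUP = "group"
    TECHNOLOGY = "technology"
    ACTIVITY = "activity"
    EVENT = "event"
    ORGANIZATION = "organization"

@dataclass
class ClassifiedTag:
    text: str
    function: Function
    entities: list[Entity]

example = ClassifiedTag("Microsoft bids for yahoo", Function.DESCRIPTION,
                        [Entity.ORGANIZATION, Entity.EVENT])
```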







Conclusions and Future Work
In this paper, we presented Echoes, a system that allows users to annotate, search, browse, and retrieve past conversations for use as resources in individual and collaborative work. The visual interface in Echoes depicts
relationships between users, tags and conversations as a social network. We conducted a pilot study to test the
effectiveness of our social network visualization and to study how users utilize the resources provided by the
interface (querying, browsing) to accomplish certain tasks. Analysis of tags contributed for actual enterprise
meetings in subsequent studies showed that the tags can help in audio search and retrieval and perform functions
such as describing and summarizing the audio.

A number of interface augmentations are currently being designed to extend the functionality of the system. These include the ability to initiate telephone conversations through the interface and to audio-skim through long conversations. We plan to actively deploy our system for persistent conversations. In particular, we want to see how users adapt it to their own use, to accomplish their day-to-day tasks. Feedback from the users can tell us how effective
the system is and suggest pointers for improvement of its design. We also plan to work further on a computational
model for tagged conversations, which can help us compute aspects of conversations like relevance to a certain
query, for a certain user or community.

References

[1] Ackerman, M. S. and Halverson, C. (1999). Organizational Memory: Processes, Boundary Objects and Trajectories. Hawaii International Conference on System Sciences, Hawaii, USA.

[2] Erickson, T. (1999). Persistent Conversation: An Introduction. Journal of Computer-Mediated Communication: Special Issue on Persistent Conversation 4(4).

[3] Golder, S. and Huberman, B. (2006). Usage Patterns of Collaborative Tagging Systems. Journal of Information Science 32(2): 198-208.

[4] John, A. and Seligmann, D. (2006). Collaborative Tagging and Expertise in the Enterprise. WWW 2006: Collaborative Tagging Workshop, Edinburgh, Scotland.

[5] John, A., Kelkar, S., Peebles, E., Renduchintala, A. and Seligmann, D. (2007). Collaborative Tagging and Persistent Audio Conversations. European Conference on Computer Supported Cooperative Work (ECSCW 2007): Workshop on CSCW and Web 2.0, Limerick, Ireland, 25 September 2007.

[6] Kelkar, S., John, A. and Seligmann, D. (2007). An Activity-based Perspective of Collaborative Tagging. International Conference on Weblogs and Social Media, Boulder, Colorado.

[7] Marlow, C., Naaman, M., Boyd, D. and Davis, M. (2006). Position Paper, Tagging, Taxonomy, Flickr, Article, ToRead. WWW 2006: Collaborative Tagging Workshop, Edinburgh, Scotland.

[8] McAfee, A. P. (2006). Enterprise 2.0: The Dawn of Emergent Collaboration. MIT Sloan Management Review 47(3): 21-28.

[9] Orr, J. E. (1986). Narratives at Work: Story Telling as Cooperative Diagnostic Activity. Computer-Supported Cooperative Work, Austin, Texas.

[10] Tucker, S., Bergman, O., Ramamoorthy, A. and Whittaker, S. (2010). Catchup: A Useful Application of Time-Travel in Meetings. Proceedings of the 2010 ACM Conference on Computer Supported Cooperative Work (CSCW 2010), Savannah, Georgia, February 6-10, 2010.

[11] Kalnikaité, V. and Whittaker, S. (2008). Social Summarization: Does Social Feedback Improve Access to Speech Data? Proceedings of the 2008 ACM Conference on Computer Supported Cooperative Work, San Diego, CA, November 8-12, 2008.

This paper was created by Avaya Labs and authored by Ajita John, Shreeharsh Kelkar, Adithya Renduchintala, and Dorée Duncan Seligmann.




Figure 8: The full screenshot of the interface. The Social Network View is on the left, the Details View of the selected conversation (orange in the Social Network View) is on the top right, and the List View is on the bottom right. Callouts indicate the search options, the list of clickable people and tags, the information displayed on moving the mouse over a conversation, and the tag field and audio player.




About Avaya
Avaya is a global leader in enterprise communications systems. The company
provides unified communications, contact centers, and related services directly
and through its channel partners to leading businesses and organizations
around the world. Enterprises of all sizes depend on Avaya for state-of-the-art
communications that improve efficiency, collaboration, customer service and
competitiveness. For more information please visit www.avaya.com.


© 2010 Avaya Inc. All Rights Reserved.
Avaya and the Avaya Logo are trademarks of Avaya Inc. and are registered in the United States and other countries.
All trademarks identified by ®, TM or SM are registered marks, trademarks, and service marks, respectively, of Avaya Inc.
All other trademarks are the property of their respective owners. Avaya may also have trademark rights in other terms used herein.
References to Avaya include the Nortel Enterprise business, which was acquired as of December 18, 2009.
03/10 • MIS4482

Contenu connexe

Similaire à Avaya Collaborative Tagging System by PacketBase

Groupware Technology Project Report
Groupware Technology Project ReportGroupware Technology Project Report
Groupware Technology Project ReportBharat Kalia
 
Implementation - Communote Enterprise Microblogging (engl.)
Implementation - Communote Enterprise Microblogging (engl.)Implementation - Communote Enterprise Microblogging (engl.)
Implementation - Communote Enterprise Microblogging (engl.)Communote GmbH
 
Jive Clearspace Best#2598 C8
Jive  Clearspace  Best#2598 C8Jive  Clearspace  Best#2598 C8
Jive Clearspace Best#2598 C8mrshamilton1b
 
Interactive Video Search and Browsing Systems
Interactive Video Search and Browsing SystemsInteractive Video Search and Browsing Systems
Interactive Video Search and Browsing SystemsAndrea Ferracani
 
Enterprise 2.0 in practice
Enterprise 2.0 in practiceEnterprise 2.0 in practice
Enterprise 2.0 in practiceFinnur Magnusson
 
Administrator tutorial 2011
Administrator tutorial 2011Administrator tutorial 2011
Administrator tutorial 2011eCairn Inc.
 
IBM Lotus Connections Overview
IBM Lotus Connections OverviewIBM Lotus Connections Overview
IBM Lotus Connections Overviewal95iii
 
Social Networking 101 - Sept. 2008 Lunch & Learn
Social Networking 101 - Sept. 2008 Lunch & LearnSocial Networking 101 - Sept. 2008 Lunch & Learn
Social Networking 101 - Sept. 2008 Lunch & LearnPatty Lewis
 
Exploiting semantics-in-collaborative-software-development-tasks
Exploiting semantics-in-collaborative-software-development-tasksExploiting semantics-in-collaborative-software-development-tasks
Exploiting semantics-in-collaborative-software-development-tasksDimitris Panagiotou
 
The Architecture Of Future Websites by Luristic
The Architecture Of Future Websites by LuristicThe Architecture Of Future Websites by Luristic
The Architecture Of Future Websites by LuristicLuristic
 
Session 2: Social Software
Session 2: Social SoftwareSession 2: Social Software
Session 2: Social SoftwareChris Sparshott
 
User tutorial 2011
User tutorial 2011User tutorial 2011
User tutorial 2011eCairn Inc.
 
Overview Clearvale - The Social Business Cloud
Overview Clearvale - The Social Business CloudOverview Clearvale - The Social Business Cloud
Overview Clearvale - The Social Business CloudBroadVision
 
Sakai 3, Architectural Choices and Community Impact
Sakai 3, Architectural Choices and Community ImpactSakai 3, Architectural Choices and Community Impact
Sakai 3, Architectural Choices and Community ImpactAuSakai
 
Development with TYPO3 5.0
Development with TYPO3 5.0Development with TYPO3 5.0
Development with TYPO3 5.0Robert Lemke
 
collaborative tagging :-by Er. Priyanka Pradhan
collaborative tagging :-by Er. Priyanka Pradhancollaborative tagging :-by Er. Priyanka Pradhan
collaborative tagging :-by Er. Priyanka PradhanPriyanka Pradhan
 

Similaire à Avaya Collaborative Tagging System by PacketBase (20)

188-tagging.pdf
188-tagging.pdf188-tagging.pdf
188-tagging.pdf
 
Groupware Technology Project Report
Groupware Technology Project ReportGroupware Technology Project Report
Groupware Technology Project Report
 
Implementation - Communote Enterprise Microblogging (engl.)
Implementation - Communote Enterprise Microblogging (engl.)Implementation - Communote Enterprise Microblogging (engl.)
Implementation - Communote Enterprise Microblogging (engl.)
 
Jive Clearspace Best#2598 C8
Jive  Clearspace  Best#2598 C8Jive  Clearspace  Best#2598 C8
Jive Clearspace Best#2598 C8
 
Interactive Video Search and Browsing Systems
Interactive Video Search and Browsing SystemsInteractive Video Search and Browsing Systems
Interactive Video Search and Browsing Systems
 
Interactive Video Search and Browsing Systems
Interactive Video Search and Browsing SystemsInteractive Video Search and Browsing Systems
Interactive Video Search and Browsing Systems
 
Enterprise 2.0 in practice
Enterprise 2.0 in practiceEnterprise 2.0 in practice
Enterprise 2.0 in practice
 
Administrator tutorial 2011
Administrator tutorial 2011Administrator tutorial 2011
Administrator tutorial 2011
 
IBM Lotus Connections Overview
IBM Lotus Connections OverviewIBM Lotus Connections Overview
IBM Lotus Connections Overview
 
Social Networking 101 - Sept. 2008 Lunch & Learn
Social Networking 101 - Sept. 2008 Lunch & LearnSocial Networking 101 - Sept. 2008 Lunch & Learn
Social Networking 101 - Sept. 2008 Lunch & Learn
 
Exploiting semantics-in-collaborative-software-development-tasks
Exploiting semantics-in-collaborative-software-development-tasksExploiting semantics-in-collaborative-software-development-tasks
Exploiting semantics-in-collaborative-software-development-tasks
 
The Architecture Of Future Websites by Luristic
The Architecture Of Future Websites by LuristicThe Architecture Of Future Websites by Luristic
The Architecture Of Future Websites by Luristic
 
Session 2: Social Software
Session 2: Social SoftwareSession 2: Social Software
Session 2: Social Software
 
User tutorial 2011
User tutorial 2011User tutorial 2011
User tutorial 2011
 
Overview Clearvale - The Social Business Cloud
Overview Clearvale - The Social Business CloudOverview Clearvale - The Social Business Cloud
Overview Clearvale - The Social Business Cloud
 
sm@jgc Session Three
sm@jgc Session Threesm@jgc Session Three
sm@jgc Session Three
 
Sakai 3, Architectural Choices and Community Impact
Sakai 3, Architectural Choices and Community ImpactSakai 3, Architectural Choices and Community Impact
Sakai 3, Architectural Choices and Community Impact
 
Connections2.5
Connections2.5Connections2.5
Connections2.5
 
Development with TYPO3 5.0
Development with TYPO3 5.0Development with TYPO3 5.0
Development with TYPO3 5.0
 
collaborative tagging :-by Er. Priyanka Pradhan
collaborative tagging :-by Er. Priyanka Pradhancollaborative tagging :-by Er. Priyanka Pradhan
collaborative tagging :-by Er. Priyanka Pradhan
 

Plus de PacketBase, Inc.

Avaya One-X Mobile SIP for Apple iOS by PacketBase
Avaya One-X Mobile SIP for Apple iOS by PacketBaseAvaya One-X Mobile SIP for Apple iOS by PacketBase
Avaya One-X Mobile SIP for Apple iOS by PacketBasePacketBase, Inc.
 
Avaya Healthcare Solutions by PacketBase
Avaya Healthcare Solutions by PacketBaseAvaya Healthcare Solutions by PacketBase
Avaya Healthcare Solutions by PacketBasePacketBase, Inc.
 
Avaya 1XC Mobile SIP for Apple
Avaya 1XC Mobile SIP for AppleAvaya 1XC Mobile SIP for Apple
Avaya 1XC Mobile SIP for ApplePacketBase, Inc.
 
Avaya web alive by PacketBase
Avaya web alive by PacketBaseAvaya web alive by PacketBase
Avaya web alive by PacketBasePacketBase, Inc.
 
Avaya and Skype Connect PacketBase is an Avaya BusinessPartner
Avaya and Skype Connect PacketBase is an Avaya BusinessPartnerAvaya and Skype Connect PacketBase is an Avaya BusinessPartner
Avaya and Skype Connect PacketBase is an Avaya BusinessPartnerPacketBase, Inc.
 
Avaya VoIP on Cisco Best Practices by PacketBase
Avaya VoIP on Cisco Best Practices by PacketBaseAvaya VoIP on Cisco Best Practices by PacketBase
Avaya VoIP on Cisco Best Practices by PacketBasePacketBase, Inc.
 
Avaya Aura Five Nines by PacketBase
Avaya Aura Five Nines by PacketBaseAvaya Aura Five Nines by PacketBase
Avaya Aura Five Nines by PacketBasePacketBase, Inc.
 

Plus de PacketBase, Inc. (8)

Avaya One-X Mobile SIP for Apple iOS by PacketBase
Avaya One-X Mobile SIP for Apple iOS by PacketBaseAvaya One-X Mobile SIP for Apple iOS by PacketBase
Avaya One-X Mobile SIP for Apple iOS by PacketBase
 
Why Do I Need an SBC
Why Do I Need an SBCWhy Do I Need an SBC
Why Do I Need an SBC
 
Avaya Healthcare Solutions by PacketBase
Avaya Healthcare Solutions by PacketBaseAvaya Healthcare Solutions by PacketBase
Avaya Healthcare Solutions by PacketBase
 
Avaya 1XC Mobile SIP for Apple
Avaya 1XC Mobile SIP for AppleAvaya 1XC Mobile SIP for Apple
Avaya 1XC Mobile SIP for Apple
 
Avaya web alive by PacketBase
Avaya web alive by PacketBaseAvaya web alive by PacketBase
Avaya web alive by PacketBase
 
Avaya and Skype Connect PacketBase is an Avaya BusinessPartner
Avaya and Skype Connect PacketBase is an Avaya BusinessPartnerAvaya and Skype Connect PacketBase is an Avaya BusinessPartner
Avaya and Skype Connect PacketBase is an Avaya BusinessPartner
 
Avaya VoIP on Cisco Best Practices by PacketBase
Avaya VoIP on Cisco Best Practices by PacketBaseAvaya VoIP on Cisco Best Practices by PacketBase
Avaya VoIP on Cisco Best Practices by PacketBase
 
Avaya Aura Five Nines by PacketBase
Avaya Aura Five Nines by PacketBaseAvaya Aura Five Nines by PacketBase
Avaya Aura Five Nines by PacketBase
 

Dernier

Apidays New York 2024 - Passkeys: Developing APIs to enable passwordless auth...
Apidays New York 2024 - Passkeys: Developing APIs to enable passwordless auth...Apidays New York 2024 - Passkeys: Developing APIs to enable passwordless auth...
Apidays New York 2024 - Passkeys: Developing APIs to enable passwordless auth...apidays
 
FWD Group - Insurer Innovation Award 2024
FWD Group - Insurer Innovation Award 2024FWD Group - Insurer Innovation Award 2024
FWD Group - Insurer Innovation Award 2024The Digital Insurer
 
Exploring Multimodal Embeddings with Milvus
Exploring Multimodal Embeddings with MilvusExploring Multimodal Embeddings with Milvus
Exploring Multimodal Embeddings with MilvusZilliz
 
Boost Fertility New Invention Ups Success Rates.pdf
Boost Fertility New Invention Ups Success Rates.pdfBoost Fertility New Invention Ups Success Rates.pdf
Boost Fertility New Invention Ups Success Rates.pdfsudhanshuwaghmare1
 
Navigating the Deluge_ Dubai Floods and the Resilience of Dubai International...
Navigating the Deluge_ Dubai Floods and the Resilience of Dubai International...Navigating the Deluge_ Dubai Floods and the Resilience of Dubai International...
Navigating the Deluge_ Dubai Floods and the Resilience of Dubai International...Orbitshub
 
Polkadot JAM Slides - Token2049 - By Dr. Gavin Wood
Polkadot JAM Slides - Token2049 - By Dr. Gavin WoodPolkadot JAM Slides - Token2049 - By Dr. Gavin Wood
Polkadot JAM Slides - Token2049 - By Dr. Gavin WoodJuan lago vázquez
 
Cloud Frontiers: A Deep Dive into Serverless Spatial Data and FME
Cloud Frontiers:  A Deep Dive into Serverless Spatial Data and FMECloud Frontiers:  A Deep Dive into Serverless Spatial Data and FME
Cloud Frontiers: A Deep Dive into Serverless Spatial Data and FMESafe Software
 
presentation ICT roal in 21st century education
presentation ICT roal in 21st century educationpresentation ICT roal in 21st century education
presentation ICT roal in 21st century educationjfdjdjcjdnsjd
 
"I see eyes in my soup": How Delivery Hero implemented the safety system for ...
"I see eyes in my soup": How Delivery Hero implemented the safety system for ..."I see eyes in my soup": How Delivery Hero implemented the safety system for ...
"I see eyes in my soup": How Delivery Hero implemented the safety system for ...Zilliz
 
Repurposing LNG terminals for Hydrogen Ammonia: Feasibility and Cost Saving
Repurposing LNG terminals for Hydrogen Ammonia: Feasibility and Cost SavingRepurposing LNG terminals for Hydrogen Ammonia: Feasibility and Cost Saving
Repurposing LNG terminals for Hydrogen Ammonia: Feasibility and Cost SavingEdi Saputra
 
Six Myths about Ontologies: The Basics of Formal Ontology
Six Myths about Ontologies: The Basics of Formal OntologySix Myths about Ontologies: The Basics of Formal Ontology
Six Myths about Ontologies: The Basics of Formal Ontologyjohnbeverley2021
 
Modular Monolith - a Practical Alternative to Microservices @ Devoxx UK 2024
Modular Monolith - a Practical Alternative to Microservices @ Devoxx UK 2024Modular Monolith - a Practical Alternative to Microservices @ Devoxx UK 2024
Modular Monolith - a Practical Alternative to Microservices @ Devoxx UK 2024Victor Rentea
 
Architecting Cloud Native Applications
Architecting Cloud Native ApplicationsArchitecting Cloud Native Applications
Architecting Cloud Native ApplicationsWSO2
 
[BuildWithAI] Introduction to Gemini.pdf
[BuildWithAI] Introduction to Gemini.pdf[BuildWithAI] Introduction to Gemini.pdf
[BuildWithAI] Introduction to Gemini.pdfSandro Moreira
 
CNIC Information System with Pakdata Cf In Pakistan
CNIC Information System with Pakdata Cf In PakistanCNIC Information System with Pakdata Cf In Pakistan
CNIC Information System with Pakdata Cf In Pakistandanishmna97
 
TrustArc Webinar - Unlock the Power of AI-Driven Data Discovery
TrustArc Webinar - Unlock the Power of AI-Driven Data DiscoveryTrustArc Webinar - Unlock the Power of AI-Driven Data Discovery
TrustArc Webinar - Unlock the Power of AI-Driven Data DiscoveryTrustArc
 
Finding Java's Hidden Performance Traps @ DevoxxUK 2024
Finding Java's Hidden Performance Traps @ DevoxxUK 2024Finding Java's Hidden Performance Traps @ DevoxxUK 2024
Finding Java's Hidden Performance Traps @ DevoxxUK 2024Victor Rentea
 
ICT role in 21st century education and its challenges
ICT role in 21st century education and its challengesICT role in 21st century education and its challenges
ICT role in 21st century education and its challengesrafiqahmad00786416
 
MS Copilot expands with MS Graph connectors
MS Copilot expands with MS Graph connectorsMS Copilot expands with MS Graph connectors
MS Copilot expands with MS Graph connectorsNanddeep Nachan
 

Dernier (20)

Apidays New York 2024 - Passkeys: Developing APIs to enable passwordless auth...
Apidays New York 2024 - Passkeys: Developing APIs to enable passwordless auth...Apidays New York 2024 - Passkeys: Developing APIs to enable passwordless auth...
Apidays New York 2024 - Passkeys: Developing APIs to enable passwordless auth...
 
FWD Group - Insurer Innovation Award 2024
FWD Group - Insurer Innovation Award 2024FWD Group - Insurer Innovation Award 2024
FWD Group - Insurer Innovation Award 2024
 
Exploring Multimodal Embeddings with Milvus
Exploring Multimodal Embeddings with MilvusExploring Multimodal Embeddings with Milvus
Exploring Multimodal Embeddings with Milvus
 
Boost Fertility New Invention Ups Success Rates.pdf
Boost Fertility New Invention Ups Success Rates.pdfBoost Fertility New Invention Ups Success Rates.pdf
Boost Fertility New Invention Ups Success Rates.pdf
 
Navigating the Deluge_ Dubai Floods and the Resilience of Dubai International...
Navigating the Deluge_ Dubai Floods and the Resilience of Dubai International...Navigating the Deluge_ Dubai Floods and the Resilience of Dubai International...
Navigating the Deluge_ Dubai Floods and the Resilience of Dubai International...
 
Polkadot JAM Slides - Token2049 - By Dr. Gavin Wood
Polkadot JAM Slides - Token2049 - By Dr. Gavin WoodPolkadot JAM Slides - Token2049 - By Dr. Gavin Wood
Polkadot JAM Slides - Token2049 - By Dr. Gavin Wood
 
Cloud Frontiers: A Deep Dive into Serverless Spatial Data and FME
Cloud Frontiers:  A Deep Dive into Serverless Spatial Data and FMECloud Frontiers:  A Deep Dive into Serverless Spatial Data and FME
Cloud Frontiers: A Deep Dive into Serverless Spatial Data and FME
 
presentation ICT roal in 21st century education
presentation ICT roal in 21st century educationpresentation ICT roal in 21st century education
presentation ICT roal in 21st century education
 
"I see eyes in my soup": How Delivery Hero implemented the safety system for ...
"I see eyes in my soup": How Delivery Hero implemented the safety system for ..."I see eyes in my soup": How Delivery Hero implemented the safety system for ...
"I see eyes in my soup": How Delivery Hero implemented the safety system for ...
 
Repurposing LNG terminals for Hydrogen Ammonia: Feasibility and Cost Saving
Repurposing LNG terminals for Hydrogen Ammonia: Feasibility and Cost SavingRepurposing LNG terminals for Hydrogen Ammonia: Feasibility and Cost Saving
Repurposing LNG terminals for Hydrogen Ammonia: Feasibility and Cost Saving
 
Six Myths about Ontologies: The Basics of Formal Ontology
Six Myths about Ontologies: The Basics of Formal OntologySix Myths about Ontologies: The Basics of Formal Ontology
Six Myths about Ontologies: The Basics of Formal Ontology
 
Modular Monolith - a Practical Alternative to Microservices @ Devoxx UK 2024
Modular Monolith - a Practical Alternative to Microservices @ Devoxx UK 2024Modular Monolith - a Practical Alternative to Microservices @ Devoxx UK 2024
Modular Monolith - a Practical Alternative to Microservices @ Devoxx UK 2024
 
Architecting Cloud Native Applications
Architecting Cloud Native ApplicationsArchitecting Cloud Native Applications
Architecting Cloud Native Applications
 
[BuildWithAI] Introduction to Gemini.pdf
[BuildWithAI] Introduction to Gemini.pdf[BuildWithAI] Introduction to Gemini.pdf
[BuildWithAI] Introduction to Gemini.pdf
 
+971581248768>> SAFE AND ORIGINAL ABORTION PILLS FOR SALE IN DUBAI AND ABUDHA...
+971581248768>> SAFE AND ORIGINAL ABORTION PILLS FOR SALE IN DUBAI AND ABUDHA...+971581248768>> SAFE AND ORIGINAL ABORTION PILLS FOR SALE IN DUBAI AND ABUDHA...
+971581248768>> SAFE AND ORIGINAL ABORTION PILLS FOR SALE IN DUBAI AND ABUDHA...
 
CNIC Information System with Pakdata Cf In Pakistan
CNIC Information System with Pakdata Cf In PakistanCNIC Information System with Pakdata Cf In Pakistan
CNIC Information System with Pakdata Cf In Pakistan
 
TrustArc Webinar - Unlock the Power of AI-Driven Data Discovery
TrustArc Webinar - Unlock the Power of AI-Driven Data DiscoveryTrustArc Webinar - Unlock the Power of AI-Driven Data Discovery
TrustArc Webinar - Unlock the Power of AI-Driven Data Discovery
 
Finding Java's Hidden Performance Traps @ DevoxxUK 2024
Finding Java's Hidden Performance Traps @ DevoxxUK 2024Finding Java's Hidden Performance Traps @ DevoxxUK 2024
Finding Java's Hidden Performance Traps @ DevoxxUK 2024
 
ICT role in 21st century education and its challenges
ICT role in 21st century education and its challengesICT role in 21st century education and its challenges
ICT role in 21st century education and its challenges
 
MS Copilot expands with MS Graph connectors
MS Copilot expands with MS Graph connectorsMS Copilot expands with MS Graph connectors
MS Copilot expands with MS Graph connectors
 

Avaya Collaborative Tagging System by PacketBase

  • 1. avaya.com WHAT ARE THE BENEFITS Echoes: A Collaborative OF COLLABORATIVE TAGGING OF Tagging System for CONVERSATIONS? Conversations in the • Identify significant conversations: Our system will identify important conversations that may be useful to the enterprise. Enterprise • Identify expert communities: If one thinks of tags as “topics”, the analysis of people and tags gives an idea of expert communities, i.e. the networks of people that grow around a tag. • Enable communication analysis: We can validate imposed groups and hierarchies (e.g. Sales, Human Resources) against real world conversational practices. One can study how well these groups are functioning, what are their communication patterns, and do existing communication flows meet expectations. • Enable rich media search and retrieval: We can search and retrieve audio and video conversational artifacts using the tags. Tags can be used to point to portions of audio and video and be used in conjunction with ASR and wordspottting techniques for enhanced search and retrieval for rich media. Table of Contents Abstract Social media websites like Flickr and del.icio.us enable collaboration by allowing Abstract ...................................................... 1 users to easily share content on the web through tagging. To provide a similar advantage to the enterprise, we have designed a tagging system called Echoes for Introduction ............................................... 2 audio conversations. We are developing telephonic interfaces, where participants Background ............................................... 2 of a spoken conversation can opt to archive and share it. We have also developed a web-based visual interface that enables annotation, search, and retrieval of archived Challenges ................................................. 3 conversations. In this interface, we have focused on visualizing relationships as a social network of users, tags and conversations, which will enable efficient searching Echoes ........................................................ 4 and browsing, and more importantly, provide contextualized interaction histories. A first pilot user study, conducted using simulated data, showed that our social Results ........................................................ 9 network visualization was effective in identifying patterns of communication. Analysis of tags contributed for actual enterprise meetings in subsequent studies showed Conclusions and Future Work .................11 that the tags can help in audio search and retrieval and perform functions such as describing and summarizing the audio. WHITE PAPER 1
  • 2. avaya.com Introduction This white paper focuses on the problem of enhancing conversations in the enterprise through social media. We have designed a system that enables persistent (audio) conversations in the enterprise, primarily by the use of the Web 2.0 technique of tagging. Tagging is widely used as it generates folksonomies that describe, classify and distinguish the tagged content. Persistent conversations can be stored, accessed and retrieved, due to which new possibilities for acting and doing arise, e.g., an individual can enhance his future conversations by being able to reflect on his past interactions and share his past conversations with other users. Persistence enables us to search, browse, replay, annotate, visualize, restructure and recontextualize conversations 2. In particular, we have designed a visualization that allows users to search, browse, find, tag and listen to (stored) conversations. Additionally, the visualization allows discovery of relevant communities as well as relationships and topics of discussion within them. Background The advent of social media under the umbrella of Web 2.0 and its enterprise counterpart Enterprise 2.0 8 is breaking down perceived barriers to collaboration and the sharing of content (e.g. the sharing of personal photographs and video on Flickr and Youtube). With a view towards extending the kind of rich user experiences provided by social media on the web to the enterprise, we have designed and implemented Echoes, a system for users to store conversations, tag them and share them with other users. We use the Web 2.0 technology of “collaborative tagging” to do this because of the sheer number of ways in which tags can be used, e.g., as descriptors, opinions, reminders for action and even for mutual awareness . 3,7 In an enterprise, conversations are the most important resource for collaborative work. Through conversations, knowledge is created and shared 9. Team-members rely on a variety of channels to converse: face-to-face, email, IM, voice etc. While some of these channels such as email routinely provide for storage, retrieval, and sharing (through copying, forwarding etc.), others such as voice generally do not support these practices. Some efforts towards this have been made in projects described in footnotes in the References section on page eleven of this paper. 10, 11 In Echoes, participants in a (audio) conversation can choose to archive it. Groups of users can be granted access to the conversation by the participants e.g. a meeting between a subset of people in a department may be deemed relevant to the rest of the department. A conversation deemed “public” can be listened to and tagged by others in the enterprise. Figure 1 depicts the functionalities that Echoes offers. 2
  • 3. avaya.com Private conversations Public conversations Archive & Listen & Tag Tag Figure 1: Functional overview of archiving and tagging conversations. Challenges In the long term, Echoes must be integrated into conversational practices in the enterprise. These span several aspects, including, but not limited to, the actual role that conversations play in the accomplishment of work in the enterprise, the creation of a collective organizational memory 1 that a knowledge worker in the enterprise can draw upon and the issue of privacy. We concede that recording audio conversations presents a thorny privacy issue, despite the potential advantages of doing so such as catching up on missed meetings and sharing relevant snippets of meetings with people. It is our belief that as users in enterprises begin to see the benefits of persistent audio conversations, any adverse reaction to recording conversations will be mitigated, provided, of course, that robust privacy safeguards are implemented. A full deployment of Echoes will include authentication and authorization controls. Participants will control access to recorded conversations using privacy setting options. We have identified the following immediate challenges that the interface must address. 1. Users must be able to query for conversations. Querying could be on the basis of people (i.e. users) or topics (i.e. tags, descriptions etc). 2. Users must be able to tag a conversation, a group of conversations or part of a conversation. 3. Users must be able to listen to a conversation as well as access all relevant information about it such as the names of the participants, tags, and taggers. 4. Users must be able to use the interface to deduce histories and interactions between different participants, and their relationships with different topics, and with each other. 3
We have attempted to address the above challenges in our current visualization. We present Echoes in detail in the next section.

Echoes

We have implemented the conferencing part of our preliminary system on the Avaya Conferencing Bridge (Meeting Exchange). A registered user is assigned a host/participant code and can join a conference call using a single access number. To record a call, the host presses *2 and then enters a 4-digit file number that will be used to name the audio recording. The audio recording, along with the names of the participants, is then imported into our database.

We have designed and developed a web-based visual interface for tagging and retrieving conversations that are conducted on the conference bridge. The overall visual interface is shown in Figure 2 (see Figure 8 at the end of the paper for a larger image). It consists of three sub-views: (a) the Social Network View, (b) the Details View and (c) the Conversation List View. The Social Network View allows a user to work out interaction histories. The Details View reveals details about a particular conversation, tag or user. The List View enables a user to view lists of conversations.

Figure 2: The visual interface and its three component views (Social Network View on the left, Details View on the top right, List View on the bottom right).

The rationale for these three views is as follows. People are familiar with "lists" in other persistent conversation channels such as email and IM; lists can be scanned quickly and can be sorted. The lower-right List View provides this functionality. The upper-right Details View functions like the "preview" panel in email clients such as Outlook: it provides the details of a selected conversation (its participants, tags and taggers), and the selected conversation can be tagged through the text field provided. The Details View also provides details for a selected tag or user. We attempted to do something slightly different in the Social Network View. The essential content of this view is the same as that of the List View: a set of conversations. However, the representation is different: here, we attempt to show the relationships between the conversations, tags and users. We have suggested a computational model for tagged conversations elsewhere [5], which is reflected in the Social Network View.
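The three views stay consistent because a selection in one view is reflected in the others. The sketch below illustrates one way such synchronization could be wired up; it is not the actual Echoes code, and the SelectionBus class, the SelectedConversation type and the logging stand-in for the Details View are all illustrative.

```typescript
// A minimal sketch of how clicking a conversation particle in the Social
// Network View could refresh the Details View and start audio playback.

interface SelectedConversation {
  id: string;
  audioUrl: string;
  participants: string[];
  tags: { tag: string; tagger: string }[];
}

type SelectionListener = (c: SelectedConversation) => void;

class SelectionBus {
  private listeners: SelectionListener[] = [];

  onSelect(listener: SelectionListener): void {
    this.listeners.push(listener);
  }

  // Called by the Social Network View when a particle is clicked.
  select(conversation: SelectedConversation): void {
    this.listeners.forEach((l) => l(conversation));
  }
}

const bus = new SelectionBus();

// The Details View re-renders with the participants, tags and taggers
// (stubbed here with a console log).
bus.onSelect((c) => {
  console.log("Details View:", c.participants, c.tags);
});

// The audio player starts playing the archived recording (browser Audio API).
bus.onSelect((c) => {
  new Audio(c.audioUrl).play();
});
```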
The Social Network View is shown in Figure 3 (see Figure 8 for a detailed view). On the top left, a user can submit keywords such as the names of users or tags. A set of conversations matching the keywords is retrieved and displayed.

Figure 3: Features of the Social Network View (search box, conversation particles, and query particles).

The conversations are represented as particles moving freely within the space, with variations in color tint indicating their "age": recent conversations are bolder, while older conversations appear faded. Details of a conversation, including its audio recording, can be obtained by clicking on a particle. The Details View (upper right) refreshes when a particle is clicked, displaying the conversation details and playing the audio file associated with that conversation.

Along with the conversations, the Social Network View displays users and tags at the bottom; these users and tags are associated with the conversations returned by Echoes in response to the query. When one of these users or tags is selected, a "query particle" is introduced into the search space (see Figure 4). Conversations that are associated with this query are attracted to it. A conversation is associated with a user if (a) she has participated in it, or (b) she has tagged it. A conversation is associated with a tag if the tag has been applied to the conversation. Lines are drawn between queries and their associated conversations, and the lines are rendered in different colors to indicate the different kinds of association (i.e., participation and tagging).
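The sketch below restates these association rules in code. It is an illustration only: the Query and ConversationNode types, the associationsFor function and the specific hex colors are assumptions, not the Echoes implementation.

```typescript
// A conversation is linked to a user query if she participated in it or
// tagged it, and to a tag query if that tag has been applied to it; each
// kind of link gets its own line color in the view.

type Query =
  | { kind: "user"; userId: string }
  | { kind: "tag"; tag: string };

type EdgeKind = "participation" | "tagging" | "tag-applied";

interface ConversationNode {
  id: string;
  participantIds: string[];
  tags: { tag: string; taggerId: string }[];
}

function associationsFor(query: Query, c: ConversationNode): EdgeKind[] {
  const edges: EdgeKind[] = [];
  if (query.kind === "user") {
    if (c.participantIds.includes(query.userId)) edges.push("participation");
    if (c.tags.some((t) => t.taggerId === query.userId)) edges.push("tagging");
  } else if (c.tags.some((t) => t.tag === query.tag)) {
    edges.push("tag-applied");
  }
  return edges;
}

// One color per association kind, so participation and tagging lines differ.
const edgeColor: Record<EdgeKind, string> = {
  participation: "#1f77b4",
  tagging: "#ff7f0e",
  "tag-applied": "#2ca02c",
};
```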
Figure 4: Steps taken to refine a query (query particles for the users Ajita and Shreeharsh are introduced; conversations related to both queries cluster between them).

When multiple queries are introduced, the conversation particles rearrange themselves so that they are closest to the queries they are related to. Conversations are color-coded according to how many queries they are related to (in Figure 4, blue, green, and orange particles are associated with 1, 2, and 3 queries, respectively). Along with the rearrangement of the conversations in the Social Network View, the List View also updates itself when queries are introduced: the conversations in the List View are sorted and color-coded according to the number of queries they are associated with.

We now consider some scenarios that illustrate how users may utilize the visual interface. While these scenarios are not exhaustive with regard to the ways users might use the interface, they help put our visualization into perspective.
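The following sketch shows the scoring, sorting and color coding just described. The scoreAndSort and particleColor names are illustrative, and isAssociated() stands in for the participation/tagging/tag rules sketched earlier; only the blue/green/orange palette comes from Figure 4.

```typescript
// Score each conversation by the number of active queries it is associated
// with, sort the List View by that count, and color the particles accordingly.

interface Scored<C> {
  conversation: C;
  relatedQueryCount: number;
}

function scoreAndSort<C, Q>(
  conversations: C[],
  queries: Q[],
  isAssociated: (c: C, q: Q) => boolean
): Scored<C>[] {
  return conversations
    .map((c) => ({
      conversation: c,
      relatedQueryCount: queries.filter((q) => isAssociated(c, q)).length,
    }))
    .sort((a, b) => b.relatedQueryCount - a.relatedQueryCount);
}

// Palette matching Figure 4: blue = 1 query, green = 2, orange = 3 or more.
function particleColor(relatedQueryCount: number): string {
  if (relatedQueryCount >= 3) return "orange";
  if (relatedQueryCount === 2) return "green";
  if (relatedQueryCount === 1) return "blue";
  return "gray"; // unrelated to any active query
}
```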
Finding the People to Consult on a Particular Topic

Adithya, a new employee, wants to find the people working on social networks research in the enterprise. He types "social networks" and about 100 conversations appear in the search space. From the row of users at the bottom, he sees that Ajita is associated with more conversations than any other user. When he clicks on Ajita (see Figure 5), the user "Ajita" is introduced into the Social Network View and a large number of conversations cluster around her. To find out more, he adds two more people to the search space: Shreeharsh and Reinhard (two names he recognizes from his limited time working in the department). He now sees that most of the conversations that Shreeharsh has had on "social networks" are with Ajita, and furthermore, Ajita and Reinhard seem to have had a number of conversations between them. He adds the user "Doree" to the search space and sees that Ajita has had conversations on social networks with Doree too. Adithya infers that Ajita is probably an authority on social networks. He listens to some of the conversations that Ajita has participated in and understands that other colleagues have consulted her for advice. He saves Ajita's contact information (right-clicking on Ajita pulls it up) for future use, as he might need to discuss his research plans on social networks with her. He also notes that the best way to approach Ajita would be through Reinhard or Shreeharsh, both of whom he knows. Note that the visualization can be enhanced with communication capabilities that allow Adithya to send Ajita an email or IM, or to call her through a softphone, directly from the interface.

Figure 5: Social Network View indicating that the user Ajita may be an authority on the topic "Social Networks", as three other users converse with her regularly on that topic.

In another scenario, Mary is preparing for an important presentation to her new client, Company X. She wants to make her presentation more effective by aligning her points with the client's core business interests, so she decides to search for a community that is interested in Company X.
Figure 6: Details View and Social Network View depicting the top users who are associated with the tag "Company X".

She types "Company X" in the main search text field and sees that the tag has been used before by quite a few people. The Details View shows that the tag "Company X" has been used most frequently by two people. She introduces these two users, as well as the tag "Company X", as queries in her search space and finds that they have archived discussions related to their work with this client. Mary wonders whether they could give her valuable feedback on her presentation and offer insights that might help her in working with Company X. She requests a short meeting with them to discuss her work and her presentation (see Figure 6).

Discovering Hidden Connections in the Community

In another scenario, a user, John, who has archived several conversations, types his own name in the keyword search to retrieve them. A row of associated people appears at the bottom of the search space. John sees the names of a few colleagues that he meets with regularly. Additionally, John notices a user, Lynne, whom he does not recognize. Curious to see what his relationship with Lynne is, he introduces two users into the search space: John (i.e., himself) and Lynne. A quick glance at the visualization tells him that Lynne has been actively tagging several of his conversations (see Figure 7).
Figure 7: Social Network View showing that Lynne has tagged many of John's conversations, indicating that they may have common interests.

John finds out (from the tags) that these conversations mainly concern a technical problem John's group has been grappling with. Perhaps Lynne's development team has been having similar problems and could provide John with valuable input. John sends Lynne a short email, and her reply confirms John's suspicion. They share their documentation and links, and decide to arrange a joint meeting between their two groups to tackle the problem.

Results

When the first prototype of Echoes was completed, we conducted a small pilot study with 4 users to test the effectiveness of our visual interface. We simulated conversation data by constructing patterns of inter-communication and tagging. The questions in the study were about the relationships that our subjects were able to observe between users, tags and conversations. We were more interested in the methods used to arrive at the answers than in the accuracy of the answers (e.g., how the subjects queried for conversations, what sub-queries they used, etc.). For this reason we asked our subjects to provide an account of how they went about answering every question. To supplement this account, we also recorded their interactions with Echoes using a screen-capture program. We asked our subjects four sets of questions, with 2 to 4 questions in each set. The questions ranged from simple ones like "How many conversations did Jane participate in on XX/XX/XX?" to slightly more difficult ones like "Can you estimate how many conversations Jack participated in between XX/XX/XX and XX/XX/XX?".

The response we obtained from the study was extremely encouraging. The participants began to develop strategies, using combinations of sub-queries and search keywords to obtain the appropriate conversations. It became clear that many of our questions would have been difficult to answer without the Social Network View, and our subjects tended to interact with this view the most. The "counting" questions (see above) were the hardest, since some of them did not have obvious answers; in these cases, our subjects tended to ask questions about some
facet of the interface (what does this query do? what does this line mean?), thus revealing their preconceptions about that aspect of the interface. This showed us which aspects of our design worked well and which did not. More importantly, our subjects gave us valuable suggestions. In our interface, people are associated with conversations as either taggers or participants; however, the study showed that a clearer distinction between a person who simply tagged a conversation and someone who participated in it was preferred. The study also showed that there is a need to emphasize who applied a tag, rather than the tag alone, since the application of a tag indicates something of interest about the tagger. The feedback and the screen captures will help guide future work.

We then selected five teams from our organization and, with their permission, started recording their scheduled audio conversations. All five teams were distributed, with members from all over the US (and, in one case, members from China). Our data collection is an ongoing process, and we hope to add more teams and continue to add audio conversations to our repository even as we add more features to Echoes. To further enrich the content, we have added to the Echoes system 15 podcasts on topics related to the meetings.

We conducted a second user study with a team of 5 members over a period of five days. We asked them to participate in a game of "tag". The purpose of the game was to get them to tag the audio conversations and comment on tags that others had used. Each member's tags were visible to the others in the same group. We asked the users to tag their own meetings as well as some of the podcasts (20 recordings in total). A total of 252 tags were contributed in the study.

We performed a detailed content analysis of the collected tags. We first looked at whether the tags could have been obtained automatically, using ASR techniques such as word-spotting. Word-spotting requires a vocabulary of standard terms. Fewer than 10% of our tags were standard (as checked against Wikipedia); most of them, on the contrary, were "hybrid". By hybrid tags we mean tags such as "Microsoft bids for yahoo" and "discussion-about-proposed-slideset": phrases that do not occur in their literal form in the audio. It is evident that tags contributed by a community have great value, as they capture concepts within the audio that could not easily have been extracted through programmatic means (a minimal sketch of such a vocabulary check appears after the classification below).

Next, we classified the tags that users contributed in the study. Note that no restrictions were imposed on the types of tags that users could come up with, and no recommendations were given. The tag classification is built around two aspects:

1. The affordances the tags give to the audio: We looked at the affordances that the tags confer on the audio and found affordances such as description, summary, comments, markers (i.e., marking a specific point or points on the audio timeline), and turn-taking. We refer to this aspect as the "function" of a tag. These affordances show that tags can be used for fine-grained searching within audio, answering questions such as "what were these meetings about?" or "what did user X say about topic Y in that presentation?".
2. The entities that the tags refer to: The tags invariably refer to different entities (such as people, groups, technologies, activities, events and organizations) that are related to the conversation but may or may not be mentioned in it. By connecting the conversation to this wide network of related entities, the tags serve to index the different conversations and relate them to each other. They also make the conversations more accessible, since a larger number of keywords now lead to specific conversations.
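The sketch below illustrates the standard-versus-hybrid distinction made above. It assumes, for illustration only, that a standard vocabulary (e.g., a set of Wikipedia article titles) is available locally as a set of lowercase strings; the normalizeTag and classifyTags functions and the toy vocabulary are hypothetical and are not the analysis pipeline we actually used.

```typescript
// Classify tags as "standard" (found in a reference vocabulary) or "hybrid"
// (phrases unlikely to occur literally in the audio).

function normalizeTag(tag: string): string {
  // e.g. "discussion-about-proposed-slideset" -> "discussion about proposed slideset"
  return tag.toLowerCase().replace(/[-_]+/g, " ").trim();
}

function classifyTags(
  tags: string[],
  standardVocabulary: Set<string>
): { standard: string[]; hybrid: string[] } {
  const standard: string[] = [];
  const hybrid: string[] = [];
  for (const tag of tags) {
    (standardVocabulary.has(normalizeTag(tag)) ? standard : hybrid).push(tag);
  }
  return { standard, hybrid };
}

// Toy example: both phrase-like tags from the study come out as hybrid,
// which is why word-spotting against a standard vocabulary would miss them.
const vocabulary = new Set(["yahoo", "microsoft"]);
const { standard, hybrid } = classifyTags(
  ["Microsoft bids for yahoo", "discussion-about-proposed-slideset", "yahoo"],
  vocabulary
);
// standard -> ["yahoo"]; hybrid -> the two multi-word tags
```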
Conclusions and Future Work

In this paper, we presented Echoes, a system that allows users to annotate, search, browse, and retrieve past conversations for use as resources in individual and collaborative work. The visual interface in Echoes depicts the relationships between users, tags and conversations as a social network. We conducted a pilot study to test the effectiveness of our social network visualization and to study how users utilize the resources provided by the interface (querying, browsing) to accomplish certain tasks. Analysis of the tags contributed for actual enterprise meetings in subsequent studies showed that tags can help in audio search and retrieval and can perform functions such as describing and summarizing the audio.

A number of interface augmentations are currently being designed to extend the functionality of the system, among them the ability to initiate telephone conversations through the interface and to audio-skim through long conversations. We plan to actively deploy our system for persistent conversations; in particular, we want to see how users adapt it to accomplish their day-to-day tasks. Feedback from users will tell us how effective the system is and suggest directions for improving its design. We also plan to work further on a computational model for tagged conversations, which can help us compute aspects of conversations such as their relevance to a certain query, user or community.

References

[1] Ackerman, M. S. and Halverson, C. (1999). Organizational Memory: Processes, Boundary Objects and Trajectories. Hawaii International Conference on System Sciences, Hawaii, USA.
[2] Erickson, T. (1999). Persistent Conversation: An Introduction. Journal of Computer-Mediated Communication: Special Issue on Persistent Conversation 4(4).
[3] Golder, S. and Huberman, B. (2006). Usage Patterns of Collaborative Tagging Systems. Journal of Information Science 32(2): 198-208.
[4] John, A. and Seligmann, D. (2006). Collaborative Tagging and Expertise in the Enterprise. WWW 2006: Collaborative Tagging Workshop, Edinburgh, Scotland.
[5] John, A., Kelkar, S., Peebles, E., Renduchintala, A. and Seligmann, D. (2007). Collaborative Tagging and Persistent Audio Conversations. European Conference on Computer Supported Cooperative Work (ECSCW 2007): Workshop on CSCW and Web 2.0, Limerick, Ireland, 25 September 2007.
[6] Kelkar, S., John, A. and Seligmann, D. (2007). An Activity-based Perspective of Collaborative Tagging. International Conference on Weblogs and Social Media, Boulder, Colorado.
[7] Marlow, C., Naaman, M., Boyd, D. and Davis, M. (2006). Position Paper, Tagging, Taxonomy, Flickr, Article, ToRead. WWW 2006: Collaborative Tagging Workshop, Edinburgh, Scotland.
[8] McAfee, A. P. (2006). Enterprise 2.0: The Dawn of Emergent Collaboration. MIT Sloan Management Review 47(3): 21-28.
[9] Orr, J. E. (1986). Narratives at Work: Story Telling as Cooperative Diagnostic Activity. Computer-Supported Cooperative Work, Austin, Texas.
[10] Tucker, S., Bergman, O., Ramamoorthy, A. and Whittaker, S. (2010). Catchup: A Useful Application of Time-Travel in Meetings. Proceedings of the 2010 ACM Conference on Computer Supported Cooperative Work (CSCW 2010), Savannah, Georgia, February 6-10, 2010.
[11] Kalnikaité, V. and Whittaker, S. (2008). Social Summarization: Does Social Feedback Improve Access to Speech Data? Proceedings of the 2008 ACM Conference on Computer Supported Cooperative Work, San Diego, CA, November 8-12, 2008.
This paper was created by Avaya Labs and authored by Ajita John, Shreeharsh Kelkar, Adithya Renduchintala, and Dorée Duncan Seligmann.
Figure 8: The full screenshot of the interface. The Social Network View is on the left, the Details View of the selected conversation (orange in the Social Network View) is on the top right, and the List View is on the bottom right. Callouts indicate the details of the search options, the list of clickable people and tags, the details of the tag field and audio player, and the information displayed on moving the mouse over a conversation.

About Avaya

Avaya is a global leader in enterprise communications systems. The company provides unified communications, contact centers, and related services directly and through its channel partners to leading businesses and organizations around the world. Enterprises of all sizes depend on Avaya for state-of-the-art communications that improve efficiency, collaboration, customer service and competitiveness. For more information please visit www.avaya.com.

© 2010 Avaya Inc. All Rights Reserved. Avaya and the Avaya Logo are trademarks of Avaya Inc. and are registered in the United States and other countries. All trademarks identified by ®, TM or SM are registered marks, trademarks, and service marks, respectively, of Avaya Inc. All other trademarks are the property of their respective owners. Avaya may also have trademark rights in other terms used herein. References to Avaya include the Nortel Enterprise business, which was acquired as of December 18, 2009. 03/10 • MIS4482