Slides from a presentation given by Holly Large, Emma Sewell (in absentia) and Dr Chris Willmott at the launch of our guide on the use of BoB ("Box of Broadcasts") and TRILT (the Television and Radio Index for Learning and Teaching) as tools for academic research. The launch event took place in London on 23rd September 2022.
BoB and TRILT for Research
1. How to Use BoB and TRILT
for research
Holly Large, Emma Sewell
& Chris Willmott
University of Leicester
2. • Role of popular media in conveying information,
understanding and shaping worldview
• Print media served by online tools such as (Lexis)Nexis
and Factiva
• TRILT & BoB as “boundaried collection” for
analysis of Broadcast media
Importance of media
3. • Television & Radio Index for Learning and Teaching
Established June 2001
• BoB - “Box of Broadcasts”
Streamed copies of TV & Radio
• BoB now contains >3 million programmes
• Both are searchable, with capacity to export data
• Examine content and context – e.g. visual framing,
music, continuity, interstitials and adverts
TRILT and BoB
4. • Clarity of reasoning for criteria crucial
- scope of project, justification to readers, reviewers, etc
• Dates - hinged around key event(s)?
- defined period of time?
- database infrastructure?
1st August 2016 = Core Channels
• Core TV Channels: BBC One (London), BBC Two, ITV1 (London),
Channel 4, Five, BBC4, More4, BBC3 [when broadcast]
April 2020 + BBC News24, Sky News
March 2022+ Al Jazeera
Setting parameters for project (1)
5. • Format - TV, radio or both?
Core radio (from 1st Aug 2016) = Radio 4
Radio 4 Extra
NB Radio does not have transcripts
• Genre - Documentary? Drama? etc
Setting parameters for project (2)
7. • PhD project - Representations of cancer genomics
and personalised medicine in UK broadcast media
• Still popular medium despite social media – avg.
2 hrs 34 mins* of broadcast TV consumed daily
• Significant tool for communication of science to
public
• Aim to generate set of programmes for qualitative
analysis - using methodology in guide
*July 2021 - https://www.barb.co.uk/viewing-data/weekly-viewing-summary-new
Case study: background
9. • Parameters:
• 5-year date range
• TV and radio
• 228 search terms – used Boolean operators
Case study: finding programmes
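The 228 search terms were combined with Boolean operators so that only programmes mentioning both topics (e.g. cancer AND genomics) were returned. As a minimal sketch of that idea, the snippet below builds an `(A OR A…) AND (B OR B…)` query string from two term groups; the terms shown are illustrative examples, not the project's actual term list, and the exact query syntax accepted by BoB or TRILT may differ.

```python
# Sketch: building a Boolean search string from two groups of terms.
# The term lists below are hypothetical examples, not the real 228-term set.
cancer_terms = ["cancer", "tumour", "oncology"]
genomics_terms = ["genomics", "genome sequencing", "personalised medicine"]

def boolean_query(group_a, group_b):
    """Combine two term groups into a single '(A1 OR A2) AND (B1 OR B2)' string."""
    a = " OR ".join(f'"{t}"' for t in group_a)
    b = " OR ".join(f'"{t}"' for t in group_b)
    return f"({a}) AND ({b})"

print(boolean_query(cancer_terms, genomics_terms))
```

Generating query strings programmatically like this also makes it easy to keep a reproducible record of exactly which searches were run.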
10. • BoB & TRILT have important
differences:
• Use different metadata – creates bias
towards/against media type
• Exporting of results – automated vs.
manual, potential for web scraping
• Re-broadcasts can appear in results
of both – abridged versions, ‘Sign
Zone’
• Boolean operators
Case study: finding programmes
11. • Differences in Windows vs
Mac import of CSV files
• Possibility of repeats:
• Similar search terms can
yield same results
• Abridged & Sign Zone
versions
Methodology: collating & refining
13. • Possibility of repeats
• Similar search terms can yield same results
• Abridged & Sign Zone versions
• Excel’s ‘Find & Replace’ tool can be used to identify &
remove repeats using unique Programme ID
• Not flawless – abridged & Sign Zone versions have
separate IDs, must be identified manually, simulcasts
(BBC One & BBC News 24)
Methodology: collating & refining
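The de-duplication step described above was done in Excel with Find & Replace, keyed on each programme's unique ID. For anyone scripting the same step, here is a minimal stdlib sketch of the logic; the `programme_id` column name and the sample rows are assumptions for illustration, not the actual TRILT/BoB export schema.

```python
import csv
from io import StringIO

# Illustrative export rows; the column names are assumptions, not the
# real TRILT/BoB export format.
raw = """programme_id,title,channel
P001,Horizon: Cancer,BBC Two
P002,Inside Science,Radio 4
P001,Horizon: Cancer,BBC Two
"""

def dedupe_by_id(rows, key="programme_id"):
    """Keep the first occurrence of each programme ID, mirroring the
    Find & Replace step performed manually in Excel."""
    seen, unique = set(), []
    for row in rows:
        if row[key] not in seen:
            seen.add(row[key])
            unique.append(row)
    return unique

rows = list(csv.DictReader(StringIO(raw)))
print(len(dedupe_by_id(rows)))  # 2 unique programmes from 3 rows
```

Note that, as the slide says, this ID-based approach cannot catch abridged or Sign Zone versions, which carry different IDs and still need manual checking.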
14. • Search terms yielded:
• 58 results from TRILT
• 2469 from BoB
• After initial repeat exclusion, left with 339 results
• Incl. abridged versions
• Repeats on different channels e.g. simulcasts on BBC One & BBC News 24
• Depends on nature of project as to whether these are removed
Results… so far
15. • Using inclusion/exclusion criteria
• Choosing a method for
programme analysis –
quantitative or qualitative?
• Conduct audience research?
Next steps
16. Thank you for
listening –
any questions?
Personalised medicine in the NHS
‘Generation Genome’ -
https://tinyurl.com/ydf7z8wq
‘Genome UK: The future of healthcare’ -
https://tinyurl.com/37ujn2nb
Learning on Screen
TRILT – https://tinyurl.com/2uflu22
BoB - https://tinyurl.com/ybzua7fc
Editor's Notes
Boundaried collection = a relatively constrained and consistent collection of material within a defined archive, as opposed to the results of a general web search, or indeed a search of, for example, the IMDb.com database, where a diverse range of sources will be identified, only some of which can actually be accessed
As suggested in the guide, I decided to use both BoB and TRILT for various reasons – more in next slides
Chris mentions parameters earlier on – these are the ones I’ve used as an example. Date range reflects release of seminal paper on the topic, rich science programming in both TV and radio, used Boolean operators as I needed programmes with content on both cancer AND genomics.
A significant difference is that they search for programmes using different information. BoB searches transcripts, which biases it against radio programmes (radio has no transcripts), whereas TRILT uses metadata such as programme title and synopsis; it is therefore appropriate to use both tools if you want to study both television and radio programming. In the case of my project, science programming is rich in both TV and radio, so I wanted to make sure all representations were included. If you do want to stick to one type of media, however, there is an option to search for ‘TV only’, etc.
Exporting results – the guide goes into this in more detail, but it is essentially a choice between a semi-automated and a manual method. In TRILT, I had a relatively small number of results that could be exported into an Excel-ready file. As mentioned in the guide, if you have more results, you may have to adopt a slightly more manual method. With BoB, I manually copied the programme information I required into the spreadsheet – I am looking into the possibility of using web scraping/coding to make this process more automated.
Rebroadcasts can still appear even if you select ‘show only latest broadcast’ – I’ve found that with Radio 4 programmes, you can get full versions and abridged versions broadcast at a later time, and both count towards the results. The same applies to BBC Sign Zone.
Boolean operators can be selected in TRILT but have to be typed in BoB
There are differences between Windows and Mac when importing CSV files into Excel, but the process is broadly similar; the guide discusses this in more detail.
After importing results, it’s important to note that the number of results you have is not necessarily a true reflection of the number of unique programmes. Multiple search terms may yield similar results, meaning there can be a lot of duplicates within the collated results.
Make use of Excel’s Find & Replace tool. Knowing that each programme has a unique ID – with some exceptions that I’ll mention shortly – you can use it to identify and then easily remove repeats.
Again, there are slight differences in what the tool looks like and how it works, but these differences are discussed in more detail in the guide. In essence, the tool helps to highlight repeats of the programme using Programme ID and these repeats can then be removed as you see fit.
As I mentioned, this isn’t foolproof – abridged and BBC sign zone programmes have different IDs, so would need to be identified by searching for the programme title, for example. I’ve also encountered simultaneous broadcasts of the same programme on different channels – usually BBC News programmes on BBC One and BBC News 24 – these have different IDs.
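Because abridged and Sign Zone versions carry different programme IDs, they have to be found by title instead. A small sketch of that manual check, grouping rows by a normalised title so that likely repeats surface together; the column names, the `"Sign Zone:"` prefix handling, and the sample rows are assumptions for illustration, not the real export schema.

```python
from collections import defaultdict

# Illustrative rows; column names are assumptions about the export format.
rows = [
    {"programme_id": "P010", "title": "Horizon: Cancer"},
    {"programme_id": "P011", "title": "Sign Zone: Horizon: Cancer"},
    {"programme_id": "P012", "title": "Inside Science"},
]

def flag_title_repeats(rows):
    """Group rows by a normalised title so that Sign Zone repeats and
    abridged versions (which have distinct programme IDs, so ID-based
    de-duplication misses them) surface together for manual review."""
    groups = defaultdict(list)
    for row in rows:
        key = row["title"].lower().removeprefix("sign zone:").strip()
        groups[key].append(row)
    return {title: group for title, group in groups.items() if len(group) > 1}

print(flag_title_repeats(rows))
```

This only flags candidates for review – simulcasts and genuinely different programmes with similar titles still need a human decision, as the notes above describe.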
Less referring to my own project here and more giving some suggestions as to what you can do once you have a sample.