JISC WBL Toolkit presentation – University of Westminster (11-06-28)
Editor's Notes
Start with an overview of work-based learning at Westminster. First: we don't have the disciplines most often associated with WBL – the medical professions, teaching, engineering (except for computer science). We DO have a range of professions where the first degree is not the professional entry qualification – architecture, law, the language professions – and newer professions: media, the creative arts, computer science to some extent, and business.

Here you might say the courses run in parallel with professional working and use a range of WBL, but people in work come to the "course as course" rather than as WBL learners in significant numbers. The relationship with industry tends to be focused on CPD and M-level; at undergraduate level, relationships tend to support the undergraduate course (placements, visiting lecturers, etc.) rather than collaborative provision and workforce development.

The internal culture tends to define WBL in terms of what we do – placements, CPD. This is something I'll come back to: whether or not it's a problem.
The areas of focus we chose to work on were INSTITUTION, SCHOOL and some of the criteria around PARTNERSHIP WORKING. We chose these because we wanted to work outside the academic community and get some insight into how the service departments thought about these kinds of learners and what they saw as their needs and issues – a cross-cutting overview.

We knew that Westminster doesn't have highly developed practice; developments tend to be individual and opportunistic, often related to a particular academic or relationship. So we were looking for perspectives across the service departments, with the academic perspective coming in at the School level. The School chosen was Computer Science, as this is the School that will be involved with the Change Academy project, but also because it has recently been restructured and is thinking about this agenda. We were also using the pilot to get a benchmark for a Change Academy project the university is running around developing and embedding an accreditation service and associated mechanisms, to position ourselves for higher levels of employer collaboration.

Looking back, we didn't include MARKETING, and I think this is an area that perhaps needs to be developed within the Toolkit. In general, we knew Westminster is not mature in WBL.
We selected six of the criteria from the Institutional readiness area, integrated key points from partnership working, and added two of the School/faculty-specific criteria (many of the faculty criteria reiterate the institutional ones), so we presented eight criteria to the participants. Participants were identified by their section roles, at a level of seniority that gave them an overview and involvement in any discussion and decision-making that had been, or would be, going on.

As we envisaged the pilot as a planning tool for a later project, we emailed files rather than using the online tools, to keep the focus we wanted. We didn't refine the self-assessment criteria or the evidence to look for, but left these to inform participants' thinking about their level of maturity. Where we combined similar criteria from different areas, we compounded the other sections as well.

So we didn't follow the Toolkit methodology beyond the first stages, because we saw that the need at Westminster was to gauge understanding, and whether people considered WBL readiness to be a "problem in need of a solution" or simply not relevant.
We decided to use interviews to ensure that participants had thought through the criteria and questions, i.e. to replace the levelling workshops. The maturity statements were sent in advance with the benchmarking criteria, and participants were encouraged to discuss them with their teams before the interview. The interviews asked them to discuss how useful the criteria were for their context and to critique the format; findings on this will be discussed later.

We also asked people to gauge maturity for their own department as well as for the institution as a whole. Participants brought anecdotal evidence to their benchmarking. The interviews will be used to plan the actions, i.e. the follow-up through the Change Academy project.
We found that different departments interpreted the purpose of the toolkit differently (is it guidance, or objectives for an implementation plan?). Within this, the split was between the academic and closely associated sections on one side and the specific service sections on the other. The more academic sections wanted guidance but were wary of the toolkit being taken as prescriptive, or even informing monitoring processes, which they would see as stifling its usefulness. Finance, student services and the more operational sections just wanted an implementation plan, which suggests that they didn't see themselves as involved in decision-making. This needs to be taken into account when customising the statements and self-assessment evidence.

The term WBL was understood differently in different departments, both service and academic. For example, some service departments saw it as what they do for their own staff development, while academic departments seemed to have very different understandings of their own, often going beyond traditional WBL employee learning, e.g. students acting as consultants to industry. Any definitions given need to let this range of understandings come out: the issue isn't whether people understand WBL but how they understand it, and how it is relevant across the whole range of contexts that academic departments are working in.

All of the participants talked of decision-making by committee, and most noted that "we are all responsible". These points were made as a matter of course, not as critique. All the service departments wanted a plan; none of the participants (academic or service) talked about implementation or how plans might come about, i.e. how committee decisions might be operationalised. These are key points that we need to take into account in our action plan.
Streamlining: our people found the amount of detail in the self-assessment points and evidence a bit oppressive. Remember that we are talking from the perspective of an institution with a way to go, so this might reflect a mindset of "oh, we can't tick much of this off". We need to stress that quantity does not equal quality. There was concern that the evidence and self-assessment guidelines should stay as guidelines, and that it be stressed they must not turn into a monitoring tool. Service departments thought the language was overly academic: written by academics for academics. Thought and discussion can usefully be given to what WBL means in your institution.

We also think our pilot has raised some questions for the toolkit. We were fortunate in that one of the interviewees had a very creative take on WBL: a computer science course leader working with open source, but also with consulting and CPD. A difficulty for some of us has been understanding WBL when the course is associated with the field the learners work in, but they may all be using it differently, e.g. for CPD; I would see this as a programme design challenge. A couple of our participants raised the importance of including marketing in anything that is outward-facing: ensuring their full understanding and working with them in building relationships.

And finally: do you need a well-recognised problem to solve to get the level of buy-in that the full process would involve? One way through this would be to focus on one area. We suspect that we would have had great difficulty in getting buy-in for the wider workshop-based approach.