
SLAS Compound Combination Screening SIG: SLAS2016 Meeting Notes


There was lively discussion among the 35 attendees of this first-ever meeting of the new SLAS Special Interest Group in San Diego. Join us in the LinkedIn forum at https://www.linkedin.com/groups/Compound-Combination-Screening-SIG-SLAS-8421958/about.

Published in: Science


  1. Findings at the Onsite Meeting, SLAS 2016 Compound Combinations SIG
     Presenters: Stephan Steigele & Oliver Leven (chair)
  2. Initiating the SIG / Logistics
      SIG’s mission: The mission of the SLAS Compound Combination Screening SIG is to create a knowledge-sharing forum for screening practitioners active in the field of compound combinations. As such, the goal is to mature the field of compound combination screening, aimed at better science that accelerates the pace of drug discovery.
       – No interest in changing the mission as of now
      Current chair: Oliver Leven, Genedata
      Co-chair: Rajarshi Guha, NIH
       – Volunteers for chair or co-chair are welcome
  3. LinkedIn Group
      Online forum to exchange ideas and questions around combination screening
      Currently 65 registered members
      Rajarshi will post interesting publications – others are welcome to post as well
      https://www.linkedin.com/groups/Compound-Combination-Screening-SIG-SLAS-8421958/about
  4. Meeting Statistics
      Some numbers from an informal query
      Total: 35 attendees
      Background / interest
       – More than 50% research scientists, the majority active in the field of combination screening
       – About 10 people from vendors (pipetting, software)
       – About 25% of the attendees have practical experience in the field
       – About 25% are interested because they might have to start combination studies at some point
  5. Agenda for SLAS 2016
     1. Typical experimental designs and their applications in different therapeutic areas
     2. Best practices in experimental designs for achieving better results (e.g., self-crosses)
     3. Analysis options for the different types of experiments
     4. Result quality assurance, standardization and comparability
     5. Combination screening panels: systematic analysis of combination effects across cell lines within one experiment
  6. Experimental Designs
      Checkerboard / full matrix
       – Requires a lot of resources
      Fixed ratio / ray
       – Pro: high efficiency. An attendee reported studies showing 98% concordance in findings between full-matrix and single-ray experiments.
       – Con: risk of missing effects if they lie off the diagonal
      Mentioned but not discussed
       – “Sparse” matrices
       – “Curve shift” design
      (The first sketch after these slide notes contrasts the dose layouts of the full-matrix and single-ray designs.)
  7. Application areas
      Combination screening: look for new, promising combinations (typically n compounds with known effect vs. m compounds with assumed activity)
      Target validation / chemical genomics: pinpoint pathways with known compounds across different cell lines
      Investigate compound behavior in combination for mechanistic studies; to drive SAR (one compound has to be fixed for this)
  8. Technologies used
      Screening technology
       – Most experiments run with simple, stable screening technologies (luminescence, fluorescence, …)
       – HCS, but only a single feature; HCS adds additional noise, which makes measuring synergistic effects more difficult
      Dispensing technology
       – Pipetting (only used by a few)
       – Acoustic (significant investment, but needed for any higher throughput)
       – Piezo technology (flexible, but only for lower throughput)
  9. Best practices in experimental designs for achieving better results
      Best practices discussed
       – Self-crosses / sham controls
       – Replicates
       – Cell-line-specific controls
      A balance between throughput and understanding of replicability has to be found
       – Running self-crosses and at least duplicates is recommended, but they add to the overall effort.
       – If the screen is run at (very) high throughput to find synergistic effects, combinations without effect can be used to judge error and reproducibility, so no replicates or self-crosses would be needed (single person’s opinion; the second sketch after these notes illustrates the idea).
  10. Analysis options for the different types of experiments
      No findings / no discussion
  11. Result quality assurance, standardization and comparability
      Known issues
       – Single-ray designs might miss synergistic effects (if off-diagonal)
       – Synergy scores may depend on matrix size (depending on the actual calculation); the last sketch after these notes gives a toy example
       – How to compare results from different experimental setups
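To make the contrast between the two main designs from slide 6 concrete, here is a minimal sketch of the dose layouts they imply. The concentrations, dilution factor, and the 1:1 mixing ratio are illustrative assumptions, not values discussed at the meeting.

```python
# Hypothetical sketch: dose layouts for a full-matrix (checkerboard) design versus a
# fixed-ratio (ray) design. All numeric choices are assumptions for illustration only.
import numpy as np

def checkerboard(top_a, top_b, n_doses=8, dilution=3.0):
    """All pairwise combinations of two serial dilutions (n_doses x n_doses wells)."""
    doses_a = top_a / dilution ** np.arange(n_doses)
    doses_b = top_b / dilution ** np.arange(n_doses)
    return [(a, b) for a in doses_a for b in doses_b]

def fixed_ratio_ray(top_a, top_b, ratio=1.0, n_doses=8, dilution=3.0):
    """One serial dilution of the mixture at a single fixed A:B ratio (n_doses wells)."""
    mix_a = top_a * ratio / (1.0 + ratio)   # assumed top dose of A in the mixture
    mix_b = top_b / (1.0 + ratio)           # assumed top dose of B in the mixture
    steps = dilution ** np.arange(n_doses)
    return [(mix_a / s, mix_b / s) for s in steps]

print(len(checkerboard(10.0, 10.0)))     # 64 wells for one compound pair
print(len(fixed_ratio_ray(10.0, 10.0)))  # 8 wells for the same pair
```

The well counts alone show why rays scale to many more compound pairs on the same plate budget, and why they can only probe points along one diagonal of the dose space.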
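Slide 9 records one attendee's suggestion that, at very high throughput, combinations without effect can stand in for replicates and self-crosses when judging error. A rough sketch of that idea, assuming Bliss excess as the synergy metric and simulated inhibition values in place of real screen data:

```python
# Sketch of using presumed non-interacting combinations as an empirical null:
# score them with the same synergy metric as real combinations and use the spread
# of those scores to judge error and reproducibility. All values are simulated.
import numpy as np

rng = np.random.default_rng(0)

def bliss_excess(obs_ab, eff_a, eff_b):
    """Observed fractional effect of the combination minus the Bliss-independence expectation."""
    expected = eff_a + eff_b - eff_a * eff_b
    return obs_ab - expected

# Simulated "null" set: non-interacting combinations measured with assay noise.
eff_a = rng.uniform(0.0, 0.6, size=500)
eff_b = rng.uniform(0.0, 0.6, size=500)
obs_ab = (eff_a + eff_b - eff_a * eff_b) + rng.normal(0.0, 0.03, size=500)

null_scores = bliss_excess(obs_ab, eff_a, eff_b)
cutoff = np.quantile(np.abs(null_scores), 0.95)
print(f"95% of the no-effect combinations score within ±{cutoff:.3f} Bliss excess")
```

Real combination scores would then be compared against this empirical spread rather than against dedicated replicate or self-cross plates.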
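Slide 11 notes that synergy scores may depend on matrix size, depending on the actual calculation. The toy example below, using a simulated dose-response surface and a summed or averaged Bliss excess, shows how the same interaction can score differently on a 4x4 versus an 8x8 grid; none of the numbers come from the meeting.

```python
# Toy illustration: a "volume"-style synergy score (summed or averaged Bliss excess)
# changes when the same interaction is sampled on grids of different size and range.
import numpy as np

def hill(c, ec50, h=1.0):
    """Simple Hill curve for fractional inhibition."""
    return c ** h / (ec50 ** h + c ** h)

def bliss_excess_matrix(doses_a, doses_b, bump_height=0.15):
    """Observed surface = Bliss expectation plus a simulated synergy bump near the EC50s."""
    fa = hill(doses_a, ec50=1.0)[:, None]
    fb = hill(doses_b, ec50=1.0)[None, :]
    expected = fa + fb - fa * fb
    bump = bump_height * np.exp(-(np.log(doses_a)[:, None] ** 2 + np.log(doses_b)[None, :] ** 2))
    observed = np.clip(expected + bump, 0.0, 1.0)
    return observed - expected

small = 3.0 ** np.arange(-2, 2)   # 4-point dilution series around the EC50
large = 3.0 ** np.arange(-4, 4)   # 8-point series spanning a wider range
for name, doses in [("4x4", small), ("8x8", large)]:
    m = bliss_excess_matrix(doses, doses)
    print(f"{name}: summed excess = {m.sum():.2f}, mean excess = {m.mean():.3f}")
```

Summing inflates the score as the grid grows, while averaging dilutes a localized interaction over more additive wells, which is one reason scores from different experimental setups are hard to compare directly.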
