ISWC 19 - On the Use of Cloud and Semantic Web Technologies for Generative Design

  1. 1/35 © 2019 Autodesk, Inc. On the Use of Cloud and Semantic Web Technologies for Generative Design. Daniel Mercier, PhD; Ali Hashemi, PhD
  2. 2/35 Outline: Context • Data unification • Data validation • Data organization • Conclusions
  3. 3/35 Context
  4. 4/35 Autodesk: CAD (Computer-Aided Design) for Manufacturing, Architecture & Construction, Games & Movies
  5. 5/35 Autodesk: a rich portfolio
  6. 6/35 The Cloud, collaborative & powerful: • New computing paradigm • Unlimited storage • Powerful computing capabilities • Highly resilient and secure (if properly implemented) • Synchronized access from multiple geo-locations across the world • Favors interconnected micro-services
  7. 7/35 (image slide)
  8. 8/35 Simulation, making the virtual real. Different types of simulations: fluid flow, structural, thermal, … Simulations can be very complex, and execution times vary greatly (milliseconds to weeks). Ref: Autodesk Fusion 360 simulation options; fire simulation in Autodesk Maya
  9. 9/35 Generative Design: derived from optimization
  10. 10/35 Challenges: create a compatible cloud platform. • Offer a rich set of multi-physics solvers (fast ones for an interactive experience, accurate ones to refine selected designs); assemble existing solvers and develop new ones • Aggregate the necessary data consumed by these solvers (materials, machines, processes, …): composed a schema-based data unification service • Guide the user to compose the initial problem and identify the desired design: composed a validation and recommendation service • Structure content for optimal storage and performance, with total/partial reuse and history branching: working on an application-specific data & metadata template system
  11. 11/35 Data unification
  12. 12/35 Historical deployment: encapsulated solutions …
  13. 13/35 Engaged migration: online aggregation of data …
  14. 14/35 Segmented databases: one solution = one unique language (Moldflow, Netfabb, Revit, FeatureCAM, …; e.g. "access to material data" expressed differently per product)
  15. 15/35 Ontology service, to share content across Moldflow, Netfabb, Revit, FeatureCAM, …
  16. 16/35 Ontology service: database information (schemas, DB registry) feeding the service for Moldflow, Netfabb, Revit, FeatureCAM, …
  17. 17/35 Ontology service content: per-application term ↔ class mappings (Moldflow, Netfabb, Revit) connected through a shared domain ontology and DB registry
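The per-application term ↔ class mapping can be sketched in a few lines. The application names come from the slides, but the local terms, ontology class names, and helper functions below are illustrative assumptions, not Autodesk's actual service API.

```python
# Each application registers a map from its local terms to shared
# domain-ontology classes; the service resolves terms through the
# ontology so applications can exchange content without speaking
# each other's language. All names here are hypothetical examples.

TERM_MAPPINGS = {
    "Moldflow": {"melt_temp": "thermal:MeltTemperature"},
    "Netfabb": {"layer_height": "am:LayerThickness",
                "melt_temp_c": "thermal:MeltTemperature"},
    "Revit": {"wall_material": "mat:Material"},
}

def resolve(app: str, term: str) -> str:
    """Return the shared ontology class for an application-local term."""
    return TERM_MAPPINGS[app][term]

def translate(src_app: str, term: str, dst_app: str):
    """Find the destination application's local term for the same
    ontology class, or None when no mapping exists."""
    target_class = resolve(src_app, term)
    for local_term, cls in TERM_MAPPINGS[dst_app].items():
        if cls == target_class:
            return local_term
    return None
```

A Moldflow melt temperature translates to Netfabb's equivalent term through the shared class, while Revit, which maps no thermal terms, yields no translation.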
  18. 18/35 White paper: "Unified Access to Heterogeneous Data Sources Using an Ontology". Semantic Technology: 8th Joint International Conference, November 2018, Awaji, Japan. DOI: 10.1007/978-3-030-04284-4_8
  19. 19/35 Considerations over service uses: on-demand vs. permanent; dispersed content vs. aggregated content; lossless vs. lossy; time-consuming vs. fast. Next: word embeddings targeting specific engineering domains to automate/simplify schema matching
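The embedding-based schema matching mentioned as a next step could look like the sketch below: match an unknown field name to the closest known ontology term by cosine similarity of their embedding vectors. The 3-dimensional vectors are toy stand-ins for a real engineering-domain word embedding; everything here is an assumption about a feature the talk only proposes.

```python
import math

# Toy embeddings for two known ontology terms (a real system would
# use vectors trained on engineering-domain text).
EMBEDDINGS = {
    "melt_temperature": [0.9, 0.1, 0.0],
    "layer_thickness": [0.1, 0.8, 0.2],
}

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def best_match(vector):
    """Return the known term whose embedding is closest to `vector`,
    i.e. the candidate mapping for an unmatched schema field."""
    return max(EMBEDDINGS, key=lambda term: cosine(EMBEDDINGS[term], vector))
```

An incoming field whose embedding lands near "melt_temperature" would be proposed as a mapping to that term, turning manual schema matching into a review task.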
  20. 20/35 Data validation
  21. 21/35 Assist during content creation, with recommendations: JSON, or chunks of JSON, from the creation process go to the validator, which generates a report with recommendations or a "validation successful" result
  22. 22/35 Data transfer: Internet → Cloud server (secure communication, secure message content, assess data syntax against the schema, validate data, consolidate data, propagate data) → file storage and database, compute workflow
  23. 23/35 Data transfer with a validator in the service mesh, leveraging JSON. Used for validation: schema syntax, description logic (ontology axioms), code logic (procedural code)
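The schema-syntax layer of such a validator can be sketched against plain dicts, dependency-free. The slides do not show the service's internals, so the schema shape, field names, and error format below are illustrative assumptions; the key behavior shown is collecting every violation so the service can report and recommend, rather than failing on the first error.

```python
# Hypothetical schema for one chunk of a generative-design study.
STUDY_SCHEMA = {
    "material": {"type": str, "required": True},
    "max_mass_kg": {"type": float, "required": True, "min": 0.0},
}

def validate_chunk(chunk: dict) -> list:
    """Check a JSON-like chunk against the schema and collect all
    violations, so the caller can report and recommend fixes."""
    problems = []
    for field, rule in STUDY_SCHEMA.items():
        if field not in chunk:
            if rule.get("required"):
                problems.append(f"missing required field: {field}")
            continue
        value = chunk[field]
        if not isinstance(value, rule["type"]):
            problems.append(f"{field}: expected {rule['type'].__name__}")
        elif "min" in rule and value <= rule["min"]:
            problems.append(f"{field}: must be > {rule['min']}")
    return problems
```

Running validation before launching a lengthy computation is exactly the use case the speaker notes describe: an incomplete or incorrect chunk is caught and reported with an actionable message instead of wasting compute time.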
  24. 24/35 Knowledge structure: classes are used to compose. Domain ontologies: domain-dependent; portable and reusable; object-oriented (versioning); e.g. geometry (mesh, polygon, vertices), materials (mechanical, thermal properties). Application models: application-specific; map to external resources; versioned; e.g. Autodesk AutoCAD, Moldflow, Revit, Maya, …
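The split between portable domain ontologies and application-specific models can be sketched as follows. The class names, the model fields, and the `classes_resolve` check are hypothetical illustrations of the composition idea, not the talk's actual ontology.

```python
# Portable, reusable domain-ontology classes, grouped by domain
# (mirroring the slide's geometry and materials examples).
DOMAIN = {
    "geometry": ["Mesh", "Polygon", "Vertex"],
    "materials": ["MechanicalProperty", "ThermalProperty"],
}

# An application model is application-specific and versioned; it
# composes domain classes instead of redefining them.
MOLDFLOW_MODEL = {
    "version": "2019.1",
    "uses": ["geometry:Mesh", "materials:ThermalProperty"],
}

def classes_resolve(model: dict) -> bool:
    """Check that every class the application model references
    exists in some domain ontology (a simple portability check)."""
    return all(
        cls in DOMAIN.get(domain, [])
        for domain, cls in (ref.split(":") for ref in model["uses"])
    )
```

Because the domain ontologies are shared, a second application model (say, for Revit) can reuse `geometry:Mesh` and stay consistent with Moldflow without any term unification.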
  25. 25/35 White paper: "Validation and Recommendation Engine from Service Architecture and Ontology". 11th International Joint Conference on Knowledge Discovery, Knowledge Engineering and Knowledge Management, September 2019, Vienna, Austria. DOI: 10.5220/0008070602660273
  26. 26/35 Next: develop higher intelligence with big data & ML. Validator servers monitor users and collect data; a knowledge repository, initialized and updated by subject-matter experts, feeds big data & ML
  27. 27/35 Data organization
  28. 28/35 Technology transformation: pre-processing, processing, and post-processing move from desktop software (browser/desktop-computer user interface, local DB) to a web-based interface backed by the Cloud: authentication/authorization, DB, object storage, computation services, and derivative services for data transformations
  29. 29/35 Data transformation. Desktop (all in one or several files): holds all data; potentially heavy; favors redundancies; versioning not included; user-organized; portable; protected by file copies. Cloud (DB + object storage): structured metadata points to data; lightweight; minimizes redundant data; versioning integrated; automatically organized; centralized content; global availability; protected by block replicas
  30. 30/35 Supporting data content & lifecycle for Generative Design: thousands of studies per project (up to TBs of data); large histories and extended branching. Linked information to define & monitor an application data ecosystem: listing & location of data; embedded & associated metadata; definition of the data lifecycle
  31. 31/35 Conclusions
  32. 32/35 Challenges for Generative Design, using Semantic Web technologies: composed a schema-based data unification service; composed a validation and recommendation service; working on enhancing the management of data and metadata
  33. 33/35 Knowledge graphs, to support Cloud operations: a source of data cohesion; map complex concepts; useful for existing or new applications; a natural bridge between human and machine; integration is application-specific; a user interface (UI) / API is critical to abstract the inherent complexity
  34. 34/35 Semantic technologies as a piece of AI: a good first layer of intelligence using Description Logic (DL) and reasoners; an excellent complement to Machine Learning (ML)
  35. 35/35 Questions
  36. 36/35 Autodesk and the Autodesk logo are registered trademarks or trademarks of Autodesk, Inc., and/or its subsidiaries and/or affiliates in the USA and/or other countries. All other brand names, product names, or trademarks belong to their respective holders. Autodesk reserves the right to alter product and service offerings, and specifications and pricing, at any time without notice, and is not responsible for typographical or graphical errors that may appear in this document. © 2019 Autodesk. All rights reserved.

Speaker notes

  • What do we care about, and why does it matter?

    Deal with the complexity.
    A means to validate the data, in all possible aspects.
    Prevent the launch of lengthy computations when the data is incomplete or incorrect.
    A simple way to create the necessary knowledge to validate the data, in a modular form so that it can be reused and recomposed for different applications.
    Report any missing or invalid content with recommendations for quick identification and correction.
    Provide justification as to why validation fails.

    Why did we do this work, and how does it differ from what exists?
    Rule checking is best known from firewalls used to secure communications.
    Traditional rule systems are usually relatively linear (one line per rule), follow a specific style (XML), have limited validation capabilities, and produce lengthy rule listings that are hard to read or interpret.

    We tried to build a complex yet easy-to-use experience for both the users and the subject-matter experts that create and maintain the validation knowledge base.
  • Initial problem targeted: lightweighting.
    But the system can also achieve a practical, performant, and good-looking geometry.
    The system does so through intelligent interactions and a back-and-forth with the user to identify the right design for the desired purpose.
  • Lightweight data unification service
    Converts data from multiple source types
    Schema-based
    Works using content maps connected by domain ontologies
    Easily deployable, with a dedicated user interface for mapping creation
  • Show application diversity:
    Netfabb – Additive manufacturing
    Moldflow – Injection molding
    Revit – Architecture
    FeatureCAM – CNC tool path
  • Customers are attached to local desktop deployments on their own machines.
    Migration happens in stages:
    Migrate data to facilitate live and regular updates
    Create bridges to Cloud compute capabilities through web-based equivalents

  • Making the applications talk the same language would require massive refactoring and entire recoding.
  • Hence the idea to introduce an ontology service as a middleman
  • Beneficial to the experience
  • Unification is really plan B, as it is time-consuming; plan A is to have all related data in the same medium if possible.
    This does not mean all data lives in one über place, but rather an intelligent domain separation with an effort to avoid duplications that would require unification.
  • Lightweight service for validation and recommendation
    JSON-based
    Rich set of validation techniques,
    with Description Logic and code logic
    Well-structured and easy-to-use validation knowledge base,
    combining domain ontologies and application models
    Easily deployable and scalable
  • Stream reasoning strategies
  • Roughly
  • Whether it is for supporting existing applications or creating new intelligent ones, the Semantic Web has an incredibly important place, directly inside core applications or within derivative services, to structure, coordinate, and reason over data. In the future, it will certainly be an essential component for building intelligence, along with algorithms in Machine Learning and technologies such as quantum computing.

    Connected to other digital storage -> capture human knowledge and expertise

    As we have neurons in our brain, the Cloud has a service architecture. The answer is a mix, but both machines and humans have limits.

    Personal opinion, mostly based on how manageable the data is (structure, lifecycle, and monitoring) and who is supposed to manage the data (human, machine).

    Machine = allows larger blobs, but beware that without control this can become overly time-consuming.
    Human = allows only moderate blobs because it is error-prone, but can be protected by validation rules and human reviews.

    Ultimately, data, like code, beyond a certain size and complexity needs restructuring, whether managed by machines or humans.