Working Group: Generic Validation service

Description

To agree the scope and requirements needed to ensure that the generic Validation service will prove fit for purpose for the community.

Objectives

1. What should the scope and target be for a generic Validation service?
2. What technology platform would be most appropriate?
3. What should the high-level rules be for packaging format, content, etc.?


Original Terminology Release Advisory Group discussion: https://confluence.ihtsdotools.org/pages/viewpage.action?pageId=38245085


Project Group Proposal: 

https://drive.google.com/file/d/19vJPHBfPKyno-jSu5x-O2lLwyRhxd667/view 


Meeting minutes & actions


 

The good news is that we now have the baseline assertion list for the SI internal validation service, including the new automated assertions currently going through final verification before deployment.

I've tried to make it as useful as possible: it includes not just the description of each assertion, but also the type of assertion and the group it currently resides in, plus a link to the relevant SQL file in Git so that you can look at the code itself if you need more information than the description provides (FYI, you may need to copy and paste the URLs rather than clicking on them, depending on whether you're using Windows or macOS):

SNOMED International Proposed Shared Validation Service Assertion list 20181017.xlsx
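
To illustrate the kind of check each assertion encodes, here is a minimal sketch of a single hypothetical assertion ("active descriptions must not refer to inactive concepts"), written against assumed table names for RF2 snapshot files loaded into a database; the authoritative SQL for every assertion is in the Git repository linked from the spreadsheet, and the schema below is illustrative only:

-- Hypothetical assertion: active descriptions must not refer to inactive concepts.
-- Table and column names assume RF2 snapshot files loaded as-is; they are
-- illustrative and may not match the actual RVF schema.
SELECT d.id AS description_id,
       d.conceptId
FROM   description_snapshot d
JOIN   concept_snapshot c ON c.id = d.conceptId
WHERE  d.active = 1
  AND  c.active = 0;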

Can I therefore please ask Chris Morris, Suzy Roy and Matt Cordell/Dion McMurtrie to go ahead with the comparison of these assertions against all of your local validation rules, and post the results here as soon as possible with the following information:

  • Gap analysis - all local assertions that can be promoted up into the new Shared Validation Service
  • De-duplication - all assertions that can be proposed for deprecation in lieu of other existing assertions. Please note that this should only be done if a) the code is logically identical (even if not written identically) and b) there is no possibility of differences in the results of the assertion, for example as a result of being run against the International Edition vs a local Extension, etc. (see the sketch below).
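
As a starting point for that de-duplication pass, one possible approach is to join the two assertion lists on lightly normalised SQL text and review the matches manually. This is a sketch only, assuming each party loads its list into a table with illustrative columns assertion_id, description and sql_text (these table and column names are assumptions, not part of any existing tooling):

-- Surface candidate duplicates by matching on normalised assertion SQL.
-- Matches are candidates only and still need manual review to confirm
-- the results cannot diverge between the International Edition and a
-- local Extension.
SELECT l.assertion_id AS local_id,
       s.assertion_id AS shared_id,
       s.description
FROM   local_assertions l
JOIN   shared_assertions s
  ON   LOWER(TRIM(l.sql_text)) = LOWER(TRIM(s.sql_text));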

 

  • The UKTC development project is slightly delayed, and so they are still in the process of asking all of their various teams to compile a complete list of RF2 rules to provide as input to this project.
  • The RVF improvement project is also taking a bit longer than expected, and so we are looking to have the updated version ready for gap analysis before the October conference.
  • We have found several useful rules in the US QA process (as part of the first US MS Releases), and so Suzy's team should also be able to provide us with extremely useful input.
  • Once the RVF improvement project is complete, Andrew Atkinson will inform everyone so that they can begin the full gap analysis
  • Dion McMurtrie, Suzy Roy and Chris Morris to then begin gap analysis
  • Andrew Atkinson will discuss the budgetary sign-off required to then start the Shared Validation Service work internally; if we can't get it approved as an internal Tech project, Suzy Roy will try to get it pushed through as a Member Forum request (as several of the MF priority projects were rejected last time).

 

  • It was agreed that we should work towards the goal of creating a shared Validation service with comprehensive enough coverage to foster interoperability, ensuring that it is useful to everyone in the community who wishes to validate their extensions/editions against an officially approved baseline of rules
  • We therefore need to start off with a base set of rules (based on the existing RVF, DROOLS and MRCM validation), and run a gap analysis against the validation services of Australia, UK and USA, to ensure maximum known coverage
  • Going forward, we can then continually improve the coverage by introducing new rules for any new scenarios that are found to be outside of the existing scope
  • Dion McMurtrie, Suzy Roy and Chris Morris to provide details of any moduleDependency- and refsetDescriptor-specific validation rules that they currently implement, so that we can add them to the scope of the current RVF improvement project (a sketch of one such moduleDependency-style check follows this list).
  • Chris Morris to conduct a gap analysis between the current version of the RVF and the UKTC's RF1 validation rules (as part of the current UKTC QA analysis), and provide a list of any rules that are relevant and applicable to RF2 but aren't currently in the RVF. Andrew Atkinson to add these to the RVF improvement project scope.
  • Dion McMurtrie, Suzy Roy and Chris Morris to lobby for the relevant resources required to implement the gap analysis in a few weeks' time.
  • Andrew Atkinson to provide an update to the group on the updated version of the RVF, after the current RVF project has been delivered (current target mid-August 2017)
  • Dion McMurtrie, Suzy Roy and Chris Morris to then conduct a gap analysis between the final version of the RVF and each of their relevant validation services, and provide a list of rules that need to be added to the RVF.
  • Andrew Atkinson to verify the consolidated list and provide Rory Davidson with new requirements for the upcoming generic Validation service project.
  • Andrew Atkinson and Rory Davidson to update the group once the generic Validation service (phase 1) is ready for verification.
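
As referenced above, here is a minimal sketch of one possible moduleDependency-style check: every module used by an active concept, other than the model component module, should declare its dependencies in the Module Dependency Reference Set (refsetId 900000000000534007). The table names and the exact rule wording are assumptions for discussion, not existing RVF assertions:

-- Find modules used by active concepts that have no active row in the
-- Module Dependency Reference Set; table names are illustrative only.
SELECT DISTINCT c.moduleId
FROM   concept_snapshot c
WHERE  c.active = 1
  AND  c.moduleId <> '900000000000012004'   -- SNOMED CT model component module
  AND  NOT EXISTS (
         SELECT 1
         FROM   moduledependency_refset_snapshot m
         WHERE  m.active = 1
           AND  m.refsetId = '900000000000534007'
           AND  m.moduleId = c.moduleId
       );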