- This examination appeal concerns a patent application for a computer-implemented "privacy-preserving data mining protocol". The protocol collects data from sources, e.g. patient records, aggregates the data, and removes individually identifiable information. For the details, I refer to the somewhat unusual claim 1 below ("the Board had doubts whether claim 1 defined the data processing performed by the various processors in a sufficiently clear manner").
- Turning to inventive step, "it is established case law that non-technical features cannot contribute to inventive step. Therefore, non-technical features may legitimately be part of the problem to be solved (T 641/00 - Two identities/COMVIK), for example in the form of a requirement specification given to the skilled person to implement."
- According to the Board, "de-identifying data, by removing individually identifiable information, and by aggregating data from a plurality of sources, is not technical. It aims to protect data privacy, which is not a technical problem. The problem of data privacy is not synonymous with data security. Data privacy concerns what information to share and not to share (and making sure that only the information that is to be shared is shared), whereas data security is about how to prevent unauthorised access to information."
- As a comment, a feature is indeed excluded under the Comvik approach if it "contributes only to the solution of a non-technical problem, e.g. a problem in a field excluded from patentability" (GL G-VII, 5.4). However, I am not yet convinced that features which are more sophisticated than "removing individually identifiable information" (e.g. adding carefully calibrated noise to the data) should be excluded from the assessment of inventive step.
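To illustrate what "carefully calibrated noise" can mean, here is a minimal sketch of Laplace-mechanism noise addition as used in differential privacy. This example is mine, not from the application or the decision: a count query has sensitivity 1, so adding Laplace noise with scale 1/epsilon makes the released count epsilon-differentially private, while the expected value still equals the true count.

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Draw one sample from a zero-mean Laplace distribution (inverse-CDF method)."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def noisy_count(true_count: int, epsilon: float) -> float:
    """Release a count with Laplace noise of scale 1/epsilon.

    A counting query changes by at most 1 when one individual's record is
    added or removed (sensitivity 1), so this achieves epsilon-DP.
    """
    return true_count + laplace_noise(1.0 / epsilon)
```

Unlike simply removing identifiers, the privacy guarantee here is quantitative: epsilon bounds how much any single individual's record can shift the distribution of the published result.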
- As a further comment, the Board could have explained in more detail why data privacy is not a technical problem. The Board's observation that data privacy is not synonymous with data security is correct as such, but it does not by itself exclude that data privacy is a technical problem; data security is clearly not the only technical problem. Likewise, the observation that "data privacy concerns what information to share and not to share" does not provide fully conclusive reasoning: it seems to presuppose a rule, or case law, according to which considerations about what information to share are never technical (i.e. the major premise of the syllogism is at least not stated explicitly).
VI. Claim 1 of the main request reads:
A Privacy Preserving Data-Mining Protocol, operating between a secure "aggregator" data processor and at least one of "source-entity" data processor, wherein the "aggregator" and the "source-entity" processors are interconnected via an electronic data-communications topology, and the protocol includes the steps of:
A) on the side of the "aggregator" processor:
(i) from a user interface--accepting a query against a plurality of the predetermined attributes and therewith forming a parameter list,
(ii) via the topology--transmitting the parameter list to each of the "source-entity" processors,
(iii) via the topology--receiving a respective file from each of the "source-entity" processors,
(iv) aggregating the plurality of files into a data-warehouse,
(iv[sic]) using the parameter list, extracting query relevant data from the data-warehouse,
(vi) agglomerating the extract, and
(vii) to a user interface--reporting the agglomerated extract; and
B) on the side of each processor of the at least one "source-entity" processors:
(i) accumulating data-items wherein some of the data-items have privacy sensitive micro-data,
(ii) organizing the data-items using the plurality of predetermined attributes,
(iii) via the topology--receiving a parameter list from the "aggregator" processor,
(iv) forming a file by "crunching together" the data-items according to the parameter list,
(v) filtering out portions of the file which characterize details particular to less than a predetermined quantity of micro-data-specific data-items, and
(vi) via the topology--transmitting the file to the "aggregator" processor.
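Steps B)(iv) and B)(v) of the claim resemble threshold suppression as known from k-anonymity-style disclosure control. The following is a minimal sketch of one plausible reading, assuming "crunching together" amounts to grouping the data-items by the requested attributes and counting; all names and the parameter k are my own illustration, not taken from the application.

```python
from collections import Counter

def crunch_and_filter(data_items, parameter_list, k=5):
    """Possible reading of claim steps B)(iv)-(v).

    B)(iv): "crunch together" the data-items by grouping them on the
    attributes named in the parameter list and counting each group.
    B)(v): filter out groups backed by fewer than k data-items, so that
    rare attribute combinations cannot single out individuals.

    data_items: list of dicts mapping attribute name -> value.
    parameter_list: attribute names received from the aggregator.
    """
    counts = Counter(
        tuple(item[attr] for attr in parameter_list) for item in data_items
    )
    # Suppress any group that characterizes fewer than k data-items.
    return {group: n for group, n in counts.items() if n >= k}
```

On this reading, the "file" transmitted in step B)(vi) contains only aggregate rows that each describe at least k individuals, which is what the third auxiliary request's "de-identifying results by aggregating results to a tabular report" also suggests.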
IX. Claim 1 of the third auxiliary request adds to the second auxiliary request:
"which may include specific individuals' names or IDs" at the end of feature A)(i);
"including de-identified results" after the word "file" in feature A)(iii);
"wherein 'crunching together' includes de-identifying results by aggregating results to a tabular report" at the end of feature B)(iv).
3. Main request, inventive step
3.1 The examining division found that the data mining protocol in claim 1 was an administrative scheme, which, when considered on its own, constituted excluded subject-matter according to Article 52(2) and (3) EPC. The examining division could not identify any technical problem solved by the data mining. In the examining division's opinion, the aim of the data processing was rather to comply with legal requirements.
3.2 The Board shares the examining division's view that de-identifying data, by removing individually identifiable information, and by aggregating data from a plurality of sources, is not technical. It aims to protect data privacy, which is not a technical problem. The problem of data privacy is not synonymous with data security. Data privacy concerns what information to share and not to share (and making sure that only the information that is to be shared is shared), whereas data security is about how to prevent unauthorised access to information.
3.3 It is established case law that non-technical features cannot contribute to inventive step. Therefore, non-technical features may legitimately be part of the problem to be solved (T 641/00 - Two identities/COMVIK), for example in the form of a requirement specification given to the skilled person to implement.