
Agile and robust human language technologies for defence – Organisation of a technological challenge

European Commission

Summary

Opening date: 22 June 2023
Deadline: 22 November 2023

Eligible entity types: For profit; Not for profit (incl. NGOs); Public sector

Eligible countries:
Aruba
Austria
Belgium
Bonaire
Bulgaria
Croatia
Curaçao
Cyprus
Czechia
Denmark
Estonia
Finland
France
French Polynesia
French Southern and Antarctic Territories
Germany
Greece
Greenland
Hungary
Ireland
Italy
Latvia
Lithuania
Luxembourg
Malta
Netherlands
New Caledonia
Poland
Portugal
Romania
Saba
Saint Barthélemy
Sint Maarten
Slovakia
Slovenia
Spain
St. Eustatius
St. Pierre and Miquelon
Sweden
Wallis and Futuna
Research, Development and Innovation; Engineering and Technology
Overview

Objective:

The objective evaluation of artificial intelligence (AI) technologies such as human language technologies (HLT) requires a specific organisation whereby systems are tested in a blind manner on data that is representative of the tasks under study, using common protocols. This scheme, pioneered by the HLT community under the term “evaluation campaign”, is also often called a “technological challenge”. One objective of the call is to organise a technological challenge driving research toward enhanced HLT systems for defence applications.
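For illustration only, and assuming that automatic speech recognition (ASR) is among the evaluated tasks and word error rate (WER) among the metrics (the actual tasks, metrics and formats are those defined in the preliminary evaluation plan of Annex 4 and in the detailed evaluation plans), the blind scoring step of such a campaign could look like the sketch below: participants only ever receive the test inputs, while the organiser keeps the reference annotations and computes the scores.

```python
# Minimal sketch of blind scoring in an HLT evaluation campaign.
# Hypothetical example: ASR submissions scored with word error rate (WER);
# the real tasks, metrics and file formats are set by the evaluation plans.

def word_error_rate(reference: str, hypothesis: str) -> float:
    """WER = (substitutions + deletions + insertions) / number of reference words."""
    ref, hyp = reference.split(), hypothesis.split()
    # Word-level Levenshtein distance by dynamic programming.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,         # deletion
                          d[i][j - 1] + 1,         # insertion
                          d[i - 1][j - 1] + cost)  # substitution or match
    return d[len(ref)][len(hyp)] / max(len(ref), 1)


def score_submission(references: dict[str, str], hypotheses: dict[str, str]) -> float:
    """Average WER over all test segments; the references never leave the organiser."""
    return sum(word_error_rate(references[seg_id], hypotheses.get(seg_id, ""))
               for seg_id in references) / len(references)


if __name__ == "__main__":
    # Hypothetical held-back references and a participant's submission.
    refs = {"seg01": "units hold position at the checkpoint",
            "seg02": "request medical evacuation at grid four two"}
    hyps = {"seg01": "units hold positions at the checkpoint",
            "seg02": "request medical evacuation at grid four two"}
    print(f"average WER: {score_submission(refs, hyps):.3f}")
```

In a real campaign such scoring would typically run on the organiser's testing infrastructure, so that the reference annotations remain undisclosed to participants until after the tests.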

Scope:

The proposals should address the organisation of a technological challenge on HLT based on the preliminary evaluation plan provided as part of the call document (cf. Annex 4). This includes the collection, annotation and distribution of data, and the writing of the evaluation plans.

Types of activities

The following table lists the types of activities which are eligible for this topic, and whether they are mandatory or optional (see Article 10(3) EDF Regulation):

Type of activity (eligibility per art 10(3) EDF Regulation):

(a) Activities that aim to create, underpin and improve knowledge, products and technologies, including disruptive technologies, which can achieve significant effects in the area of defence (generating knowledge): Yes (optional)

(b) Activities that aim to increase interoperability and resilience, including secured production and exchange of data, to master critical defence technologies, to strengthen the security of supply or to enable the effective exploitation of results for defence products and technologies (integrating knowledge): Yes (mandatory)

(c) Studies, such as feasibility studies to explore the feasibility of new or upgraded products, technologies, processes, services and solutions: Yes (optional)

(d) Design of a defence product, tangible or intangible component or technology as well as the definition of the technical specifications on which such a design has been developed, including any partial test for risk reduction in an industrial or representative environment: Yes (optional)

(e) System prototyping of a defence product, tangible or intangible component or technology: No

(f) Testing of a defence product, tangible or intangible component or technology: No

(g) Qualification of a defence product, tangible or intangible component or technology: No

(h) Certification of a defence product, tangible or intangible component or technology: No

(i) Development of technologies or assets increasing efficiency across the life cycle of defence products and technologies: No

The proposals must cover at least the following tasks as part of the mandatory activities:

* setting-up of the infrastructures for testing HLT systems in the framework of the technological challenge;
* collection and annotation of data, quality assessment, distribution and curation of databases (an illustrative curation check is sketched after this list);
* organisation of the evaluation campaigns, and in particular:
    + coordination of the exchanges with the participating teams and any other relevant stakeholders on the evaluation plans and elaboration of these plans;
    + management of the experimental test campaigns and of the objective measurements of the performances of the systems submitted to the tests by the participating teams according to the protocols and metrics described in the evaluation plans;
    + organisation of the debriefing workshops.
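Purely as an illustration of the data curation and quality-assessment task above, the sketch below checks a hypothetical JSON-lines manifest in which each record points to an audio file and its annotation; the field names and the checks themselves are assumptions made for the example, not formats prescribed by the call, which are to be agreed in the evaluation plans.

```python
# Sketch of a basic quality check over a hypothetical JSON-lines corpus
# manifest (one JSON object per line). The field names "segment_id",
# "audio_path", "language" and "transcript" are illustrative only.
import json
from pathlib import Path

REQUIRED_FIELDS = {"segment_id", "audio_path", "language", "transcript"}


def check_manifest(manifest_path: str) -> list[str]:
    """Return human-readable problems found in the manifest, if any."""
    problems, seen_ids = [], set()
    lines = Path(manifest_path).read_text(encoding="utf-8").splitlines()
    for line_no, line in enumerate(lines, start=1):
        if not line.strip():
            continue
        try:
            record = json.loads(line)
        except json.JSONDecodeError as exc:
            problems.append(f"line {line_no}: invalid JSON ({exc})")
            continue
        missing = REQUIRED_FIELDS - record.keys()
        if missing:
            problems.append(f"line {line_no}: missing fields {sorted(missing)}")
        seg_id = record.get("segment_id")
        if seg_id in seen_ids:
            problems.append(f"line {line_no}: duplicate segment_id {seg_id!r}")
        seen_ids.add(seg_id)
        audio = record.get("audio_path")
        if audio and not Path(audio).is_file():
            problems.append(f"line {line_no}: audio file not found: {audio}")
    return problems


if __name__ == "__main__":
    # "corpus_manifest.jsonl" is a placeholder path for the example.
    manifest = Path("corpus_manifest.jsonl")
    if manifest.is_file():
        for problem in check_manifest(str(manifest)):
            print(problem)
    else:
        print(f"no manifest found at {manifest}")
```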

The proposals should include clear descriptions of the proposed criteria to assess work package completion. Criteria should include the production of detailed evaluation plans agreed upon by all stakeholders, the production of the annotated databases needed for the evaluations, the production of measurements for all systems submitted to the tests by the participating teams following these plans, and the organisation of the needed events.

Functional requirements

The proposed solutions should enable the measurement of the performances of HLT systems according to detailed evaluation plans based on the preliminary evaluation plan provided as part of the call document (cf. Annex 4). Key aspects of the foreseen detailed evaluation plans and associated data management should be described in the proposals. The proposals should in particular describe:

* the scenarios considered, and the nature and size of data to collect;
* the languages and dialects that can be covered;
* the nature and volume of data annotation to be produced;
* a framework for trusted sharing of data during the challenge and beyond;
* a detailed plan of the test campaigns and an overall timeline/Gantt chart of the challenge;
* the evaluation procedures (rules and tools to implement the metrics) and significance tests to be performed on measurements (one possible test is sketched after this list).
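To make the last point of the list concrete, the sketch below implements one common choice of significance test for comparing two systems evaluated on the same test set, a paired bootstrap over per-segment error scores; whether this test or another (for example approximate randomisation) is used is for the detailed evaluation plans to specify.

```python
# Sketch of a paired bootstrap significance test over per-segment error
# scores (lower is better) of two systems on the same blind test set.
# Illustrative only; the actual tests are defined in the evaluation plans.
import random


def paired_bootstrap(scores_a: list[float], scores_b: list[float],
                     n_resamples: int = 10_000, seed: int = 0) -> float:
    """Fraction of resampled test sets on which system A is not better than
    system B; a small value suggests A's observed advantage is unlikely to be
    a sampling artefact."""
    assert len(scores_a) == len(scores_b), "scores must be paired per segment"
    rng = random.Random(seed)
    n = len(scores_a)
    a_not_better = 0
    for _ in range(n_resamples):
        idx = [rng.randrange(n) for _ in range(n)]  # resample segments with replacement
        mean_a = sum(scores_a[i] for i in idx) / n
        mean_b = sum(scores_b[i] for i in idx) / n
        if mean_a >= mean_b:
            a_not_better += 1
    return a_not_better / n_resamples


if __name__ == "__main__":
    # Hypothetical per-segment WERs for two submissions on eight segments.
    system_a = [0.10, 0.22, 0.05, 0.18, 0.30, 0.12, 0.07, 0.25]
    system_b = [0.14, 0.25, 0.09, 0.20, 0.28, 0.15, 0.11, 0.31]
    print(f"p(A not better than B) = {paired_bootstrap(system_a, system_b):.3f}")
```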

A user board consisting of representative defence users should be set up and involved in the preparation of the evaluation plans and of the data. Data should be representative of use cases of interest for defence, such as peacekeeping operations, intelligence, surveillance and reconnaissance (ISR), and command and control (C2). The proposals should describe the foreseen efforts from users to test demonstrators and provide feedback.

During the challenge, detailed evaluation plans should be prepared for each evaluation campaign. Drafts of these detailed evaluation plans should be submitted to the participating teams for discussion, early enough for their feedback to be taken into account in the actual evaluation campaigns. Any evolution of the evaluation plans should take into account several factors: technical possibilities and cost, scientific relevance of the measurements, and representativeness of the metrics and protocols with respect to military needs.

More generally, the user board and the participating teams should be involved in the steering of the challenge. The proposals should include a clear description of the foreseen governance and decision-making processes.

Expected Impact:

The outcome should contribute to:

* enhanced metrics and protocols to measure progress of R&D on HLT;
* standardisation of testing for HLT;
* enhanced clarity on the performances of HLT systems for all stakeholders, including system developers, funders and users;
* enhanced framework for EU and EDF associated countries (Norway) cross-border collaboration and sharing of linguistic resources, software components, systems and services;
* HLT community building at the European defence level;
* availability of databases to further develop and test HLT systems.
Eligibility
  1. Admissibility conditions: described in section 5 of the call document

Proposal page limits and layout: described in Part B of the Application Form available in the Submission System

  2. Eligible countries: described in section 6 of the call document

  3. Other eligibility conditions: described in section 6 of the call document

  4. Financial and operational capacity and exclusion: described in section 7 of the call document

  5. Evaluation and award:

Award criteria, scoring and thresholds: described in section 9 of the call document

Submission and evaluation processes: described in section 8 of the call document and the Online Manual

Indicative timeline for evaluation and grant agreement: described in section 4 of the call document

  6. Legal and financial set-up of the grants: described in section 10 of the call document

Call documents:

Call document

Templates for proposals should be downloaded from the Submission System (available at the opening of the call); the links below are examples only:

Lump Sum MGA v1.0

Additional documents:

EDF Annual Work Programme

EDF Regulation 2021/697

Generic Programme Security Instruction (PSI) concerning European Defence Fund

EU Financial Regulation 2018/1046

Rules for Legal Entity Validation, LEAR Appointment and Financial Capacity Assessment

EU Grants AGA — Annotated Model Grant Agreement

Funding & Tenders Portal Online Manual

Funding & Tenders Portal Terms and Conditions

Funding & Tenders Portal Privacy Statement
