260 Teams | 308 Competitors | 735 Submissions


KDD 2019 | Policy Learning for Malaria Control

Getting Started in Python


Example starter code*

*Updated 29th April

Please use the following class template as a pattern to encapsulate your submission.

You may modify it extensively; however, you must:

  • permit the environment object to be passed in the class initialization, and
  • implement the "generate" method to return the best policy developed by your algorithm. In this template, you are free to add (but not remove) methods and parameters, and to set defaults for any of your additional parameters so that your code performs at its best. Only the generate method will be called during evaluation, and no parameters will be passed to it.
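For reference, the policy your generate method returns follows the shape used in the starter code below: a dict mapping each of the five timestep labels ("1" to "5") to a pair of intervention values in [0, 1]. A minimal sketch of building such a random policy (the helper name random_policy is ours, not part of the challenge API):

```python
import random

def random_policy(episode_length=5):
    """Build a random policy: timestep label -> [ITN, IRS] values in [0, 1]."""
    return {str(t + 1): [random.random(), random.random()]
            for t in range(episode_length)}

policy = random_policy()
# a dict with keys "1".."5", each mapped to two floats in [0, 1]
```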

*Updated 1st July

A proving environment is also available for participants to hone their submissions before the 6th July deadline.

Previous environments may be replaced with the class:

ChallengeProveEnvironment()
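Switching an existing agent over to the proving environment only changes the environment construction; the agent code is untouched. A hedged sketch, assuming ChallengeProveEnvironment is importable from netsapi.challenge and exposes the same interface as the training environment (the try/except is only so the snippet runs where netsapi is not installed):

```python
# Swap the training environment for the proving one; CustomAgent is unchanged.
try:
    from netsapi.challenge import ChallengeProveEnvironment
    env = ChallengeProveEnvironment()
except ImportError:
    env = None  # netsapi not installed locally

# agent = CustomAgent(env)
# best_policy, best_reward = agent.generate()
```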

*Updated Submissions:

Note: your XX.zip file should contain your single code submission, titled submission.py, which defines your CustomAgent() class, plus a Requirements.txt file (which may be auto-generated).

Example submission: submission.py

# In a notebook, install the challenge environment first:
# !pip3 install git+https://github.com/slremy/netsapi --user --upgrade

import random
from sys import exit, exc_info, argv

import numpy as np
import pandas as pd

from netsapi.challenge import *

class CustomAgent:
    def __init__(self, environment):
        self.environment = environment

    def generate(self):
        best_policy = None
        best_reward = -float('Inf')
        candidates = []
        try:
            # Agents may use up to 20 episodes in each training run when
            # making sequential decisions; here we sample 20 random policies.
            for i in range(20):
                self.environment.reset()
                policy = {}
                for j in range(5):  # episode length is 5 timesteps
                    policy[str(j + 1)] = [random.random(), random.random()]
                candidates.append(policy)
                
            rewards = self.environment.evaluatePolicy(candidates)
            best = int(np.argmax(rewards))
            best_policy = candidates[best]
            best_reward = rewards[best]
        
        except (KeyboardInterrupt, SystemExit):
            print(exc_info())
            
        return best_policy, best_reward

How to create a local score for your CustomAgent()

*not included in submission.py

EvaluateChallengeSubmission(ChallengeSeqDecEnvironment, CustomAgent, "example.csv")  # generates your local scoring file; this line should *not* be included in submission.py

FAQ


Is there another environment I can prove my submission on?

A proving environment is available for participants to hone their submissions before the 6th July deadline.

Previous environments may be replaced with the class:

ChallengeProveEnvironment()


What libraries are supported for my submission?


tensorflow

numpy

scipy

keras

pytorch

sklearn

pandas

 


Where can I find out more about malaria?

 

 

What are ITNs and IRS? 

 

Insecticide Treated Nets (ITNs) and Indoor Residual Spraying (IRS) are among the most studied interventions in modelling malaria transmission.

 

Where can I find more information about Reinforcement Learning? 

 

Reinforcement learning is covered in our tutorial material, and you will find extensive reading materials and many useful blogs suited to your level and interest.

 

What malaria setting is being modelled? 

 

For the purposes of this challenge we have developed our own model of an abstract location, to avoid any distraction outside of the KDD Cup environment and to prevent any advantage that background material might provide.

Policy Learning for Malaria Control

Malaria is thought to have carried the greatest disease burden throughout human history, and it continues to pose a significant but disproportionate global health burden: 50% of the world's population is at risk of malaria infection, and Sub-Saharan Africa is most affected, accounting for 90% of all cases.


Through this KDD Cup|Humanity RL track competition we are looking for participants to apply machine learning tools to develop novel solutions that could impact malaria policy in Sub-Saharan Africa. Specifically: how should combinations of interventions that control the transmission, prevalence, and health outcomes of malaria infection be distributed in a simulated human population?



This challenge has been framed as a Reinforcement Learning problem: participants are expected to submit high-performing solutions to the sequential decision-making task. In this competition, actions receive stochastic and delayed rewards and are resource-constrained by both the monetary and computational costs associated with implementing and observing the impact of an action.

 

Submissions are encouraged from participants who may not have previous experience with reinforcement learning problems; as you read through the materials, we hope you will see the need for contributions that inform decision making on this complex real-world problem.


Feedback phase 15th April


·      Learn: Read the Getting started code in Python, Follow Tutorials, Blogs and Engage in the Discussion Forum

·      Test: Run the example code and generate a submission file to post a Public leaderboard score

·      Build: Iterate, improve, and refine your own code, and make sure you have scored in the top 100 of the Public leaderboard. Along with a top-100 score, you must submit your code to proceed to the Check phase.

·      Deadline 00:01 UTC 30th June

 

Check phase 30th June

 

·      Top 100 participants ranked on the Public leaderboard from the Feedback phase will have access to the new Check phase environment

·      Build: There is no Public leaderboard in the Check phase; the private dashboard remains so you can continue to iterate, improve, and refine your new Check phase submission. To be considered in the Verification phase you must submit your final Check phase code

·      Deadline 00:01 UTC 7th July

 

 

Verification phase 7th July

 

·      Check phase submissions will be run, validated and verified by the organisers on the Verification phase environment

·      The top 10 winners of the competition will be determined, and a final ranking/scoring of the Check phase submissions will be posted on the platform.

·      Winners Announced 20th July

 

Reward : $25K (Sponsor KDD)


Timeline (UTC)


Apr 15th, 2019: Competition Launch. Participants can register, join the discussion forum and form teams around the Challenge.


Jun 30th, 2019: End of the Feedback Phase; the top 100 contestants are migrated automatically to the Check Phase and prompted to submit code.


Jul 7th, 2019: End of the Check Phase, organizers start code verification.


Jul 20th, 2019: Announcement of the KDD Cup|Humanities Track Winner.


Aug 4th, 2019: Beginning of KDD 2019.




  • About KDD 2019 Conference

    The annual KDD conference is the premier interdisciplinary conference bringing together researchers and practitioners from data science, data mining, knowledge discovery, large-scale data analytics, and big data.

    KDD 2019 Conference:

     

    August 4 - 8, 2019

    Anchorage, Alaska USA

    Dena’ina Convention Center and William Egan Convention Center

     

    About Other 2019 KDD Cup Competitions

    KDD Cup is the annual Data Mining and Knowledge Discovery competition organized by ACM Special Interest Group on Knowledge Discovery and Data Mining, the leading professional organization of data miners. SIGKDD-2019 will take place in Anchorage, Alaska, US from August 4 - 8, 2019. The KDD Cup competition is anticipated to last for 2-4 months, and the winners will be notified by mid-July 2019. The winners will be honored at the KDD conference opening ceremony and will present their solutions at the KDD Cup workshop during the conference. 

     

    In KDD Cup 2019, there are three competition tracks:

     

    1.    Automated Machine Learning Competition Track (Auto-ML Track) 

    2.    Regular Machine Learning Competition Track (Regular ML Track)

    3.    “Research for Humanity” Reinforcement Learning Competition Track (Humanity RL Track) [This Competition]


    About KDD Cup Chairs

    Taposh Dutta-Roy (Kaiser Permanente)

    Wenjun Zhou (University of Tennessee Knoxville)

    Iryna Skrypnyk (Pfizer)

     

Evaluation

Submission method:


Code must be submitted through the competition platform by the deadlines for evaluation. The top 100 participants on the public leaderboard of test submissions will be given access to the final submission phase. Using multiple accounts to increase the number of submissions is NOT permitted. The entries must be formatted as specified on the Competition page. In case of any problems, send email to support@hexagon-ml.com. Competition-related questions can be posed in our discussion forum, KDD 2019 Policy Learning For Malaria Elimination.


Awards:


The 10 top ranking final submissions for the KDD Cup|Humanities Track Competition qualify for cash prizes:


1st     $5000

2nd    $4000

3rd     $3000

4th     $3000

5th     $3000

6th     $2000

7th     $2000

8th     $1000

9th     $1000

10th   $1000

 

There is no other publication requirement. The winners will be required to make their code publicly available under an OSI-approved license (for instance, Apache 2.0, MIT, or a BSD-like license) if they accept their prize, within a week of the deadline for submitting the final results. Entries exceeding the time budget will not qualify for prizes. In case of a tie, the prize will go to the participant who submitted their entry first. Non-winners, and entrants who decline their prize, retain all rights to their entries and are not obliged to publicly release their code.

Rules

Announcements:

To receive announcements and be informed of any change in rules, participants must provide a valid email to the challenge platform.


Conditions of participation:

Participation requires complying with the rules of the Competition. Prize eligibility is restricted by US government export regulations. The organizers, sponsors, their students, close family members (parents, sibling, spouse or children) and household members, as well as any person having had access to the truth values or to any information about the data or the Competition design giving him (or her) an unfair advantage are excluded from participation. A disqualified person may submit one or several entries in the Competition and request to have them evaluated, provided that they notify the organizers of their conflict of interest. If a disqualified person submits an entry, this entry will not be part of the final ranking and does not qualify for prizes. The participants should be aware that the organizers reserve the right to evaluate for scientific purposes any entry made in the challenge, whether or not it qualifies for prizes.

Dissemination:

The Winners will be invited to attend a workshop organized in conjunction with KDD and contribute to the proceedings.

Registration:

The participants must register to the platform and provide a valid email address. Teams must register only once and provide a group email, which is forwarded to all team members. Teams or solo participants registering multiple times to gain an advantage in the competition may be disqualified.


Anonymity:

The participants who do not present their results at the workshop can elect to remain anonymous by using a pseudonym. Their results will be published on the leaderboard under that pseudonym, and their real name will remain confidential. However, the participants must disclose their real identity to the organizers to claim any prize they might win. 


Submission Expectation:

  • Your submission is based on 10 instantiations of the environment object.
  • Each environment has a limit of 105 evaluations, with an episode requiring 5 evaluations.
  • Your submission should include a learnt policy and its evaluation, in the provided .csv format.
  • For best practices and expectations, refer to the example and tutorial code.
  • Knowledge must not be shared between runs; any submission violating this will be removed in the Verification phase.
  • Submissions should be reasonably constrained to standard Python libraries.
  • If submitted code cannot be run, the team may be contacted; if minor remediation fails or insufficient information is provided to run the code, the submission will be removed.
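The evaluation budget in the first two bullets can be made concrete with a little arithmetic (a sketch; the constants come from the rules above):

```python
# Budget per environment instantiation, per the submission expectations.
ENVIRONMENTS = 10       # instantiations of the environment object
EVALS_PER_ENV = 105     # evaluation limit per environment
EVALS_PER_EPISODE = 5   # one episode consumes 5 evaluations

episodes_per_env = EVALS_PER_ENV // EVALS_PER_EPISODE  # 21 full episodes
total_episodes = ENVIRONMENTS * episodes_per_env       # 210 across a submission
```

This is why the starter code runs 20 episodes per training run: it stays just inside the 21-episode per-environment limit.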


Leaderboard

Rank Team Score Count Submitted Date
1 Pidgey 999.99993665 2 June 26, 2019, 1:32 a.m.
2 xierui 999.90000000 5 June 11, 2019, 12:14 a.m.
3 unknown 570.70470084 4 June 13, 2019, 10:26 a.m.
4 RL-Hacker 569.30552009 7 June 30, 2019, 6:52 p.m.
5 Alpha 562.96886460 7 June 23, 2019, 9 p.m.
6 NULL 562.30191778 15 June 20, 2019, 3:05 a.m.
7 A^2 558.76327344 12 June 20, 2019, 3:28 a.m.
8 ZCYM 532.74972225 8 June 27, 2019, 2:09 a.m.
9 et373 521.79321004 6 June 14, 2019, 2:53 a.m.
10 ENORMOUS_HAMMER_wang 519.95010369 20 June 19, 2019, 1:14 a.m.
11 LOLs 519.41155060 8 June 26, 2019, 8:33 a.m.
12 jingw2 514.90325475 12 May 30, 2019, 4:12 a.m.
13 Plato 511.50351434 2 June 29, 2019, 4:39 a.m.
14 fddfdd 511.47344054 3 June 19, 2019, 2:33 p.m.
15 Zachary 507.23094890 7 June 19, 2019, 12:21 p.m.
16 Chia 507.23094890 5 June 19, 2019, 12:43 p.m.
17 NCKU CSIE 506.32072879 7 June 19, 2019, 1:08 p.m.
18 NCKU CSIE bill wei 497.55923806 12 June 18, 2019, 5:13 a.m.
19 NTT DOCOMO Labs 497.34066254 12 June 12, 2019, 10:10 p.m.
20 Ironball 490.46992894 3 June 28, 2019, 10:52 p.m.
21 Dejavu 486.40188282 1 June 12, 2019, 11:54 a.m.
22 abhor 465.47705950 6 June 30, 2019, 10:47 p.m.
23 All you need is deep learning 462.76825126 3 May 29, 2019, 6:55 a.m.
24 vlad 461.09484206 6 May 15, 2019, 4:10 a.m.
25 Anand 453.28944377 3 June 29, 2019, 11:22 a.m.
26 Zay 440.33012878 10 June 4, 2019, 8:35 p.m.
27 Paulina 439.80026634 15 May 30, 2019, 6:20 a.m.
28 saite fan 415.94301385 9 June 18, 2019, 12:51 a.m.
29 Charlie 410.68072000 3 June 21, 2019, 12:41 a.m.
30 Testtesttest 400.47680815 1 May 22, 2019, 12:12 p.m.
31 NCKU shallow learning 397.32426500 5 June 21, 2019, 2:12 a.m.
32 Winter_Is_Coming 390.45638831 2 June 28, 2019, 5:52 a.m.
33 MathCo 390.18228645 3 June 30, 2019, 8:52 a.m.
34 Karab 387.84357007 5 May 27, 2019, 8:24 p.m.
35 avnish narayan 377.12720490 2 June 26, 2019, 5:09 p.m.
36 mr sandman 375.83112128 4 June 29, 2019, 4:17 a.m.
37 Tensorflaws 361.22836746 11 June 19, 2019, 8:01 a.m.
38 it bites 355.46414197 1 June 27, 2019, 3:19 a.m.
39 M_P_Irsch 354.13119641 3 June 29, 2019, 9:53 a.m.
40 gnemoto 347.87829606 8 June 8, 2019, 8:31 p.m.
41 El 343.41726569 6 June 26, 2019, 8:09 p.m.
42 A10 326.08872112 5 June 23, 2019, 8:26 a.m.
43 LahaAle 314.80554396 10 June 9, 2019, 4 p.m.
44 yenrabbit 303.16805564 2 June 15, 2019, 11:09 a.m.
45 rohansaphal 301.31876815 8 June 29, 2019, 10:34 a.m.
46 batsy 300.13474520 1 June 26, 2019, 10:22 p.m.
47 ka2yama 296.10032690 3 June 22, 2019, 4:54 p.m.
48 eventhorizon 294.94139730 4 July 1, 2019, 7:27 a.m.
49 Ashish 287.40642741 8 June 27, 2019, 8:12 p.m.
50 Bonum 278.29383148 4 June 24, 2019, 5:45 a.m.
51 NCKU Jerry 270.89203832 3 June 18, 2019, noon
52 DoPlus 265.55637648 3 June 27, 2019, 1:26 a.m.
53 fersebas 262.30656906 2 June 24, 2019, 3:02 a.m.
54 Ratio 234.09241077 2 June 9, 2019, 4:12 a.m.
55 Max Ye 219.29056661 3 June 29, 2019, 8:20 a.m.
56 helloword 215.94301385 1 June 18, 2019, 12:49 a.m.
57 Thuan 195.06313011 3 June 24, 2019, 2:40 a.m.
58 kim 194.30791452 2 June 27, 2019, 5:46 a.m.
59 xinxinxinxinxin 192.13800881 6 May 30, 2019, 7:47 a.m.
60 fightraccoon 184.90006036 1 June 2, 2019, 4:49 p.m.
61 agill 183.49116070 3 June 11, 2019, 1:05 p.m.
62 oetbent 180.83652172 10 June 18, 2019, 2:04 a.m.
63 DeepBlueAI 179.53279228 3 May 21, 2019, 8:41 p.m.
64 mastermind99 176.70695794 7 May 14, 2019, 11:02 p.m.
65 kimuragt 172.53509911 1 May 17, 2019, 7:06 p.m.
66 guest1 167.36329676 4 May 14, 2019, 9:50 a.m.
67 Sherry 167.36329676 2 May 15, 2019, 12:22 a.m.
68 major_tom 167.36329676 2 May 16, 2019, 6:47 a.m.
69 m_nakamura 167.36329676 1 May 16, 2019, 8:02 p.m.
70 KDDCup2019Chair 167.36329676 2 May 23, 2019, 6:12 a.m.
71 deepblue 167.36329676 3 June 10, 2019, 11:48 p.m.
72 jak 167.36329676 1 June 29, 2019, 4:58 a.m.
73 JINGJINGXIAO 166.32405490 1 June 18, 2019, 12:02 a.m.
74 Mosquitos suck! 166.03660169 1 May 31, 2019, 7:29 a.m.
75 Shashwat 164.32415140 1 May 16, 2019, 3:49 a.m.
76 Saurabh 117.77647081 1 May 27, 2019, 7:27 a.m.
77 falcon 112.95331500 3 May 29, 2019, 4:09 a.m.
78 tarai 101.27567986 3 May 14, 2019, 1:31 a.m.
79 Ashutosh 71.36875808 2 May 16, 2019, 5:34 a.m.
80 jancio 70.78199772 1 June 29, 2019, 5:35 p.m.
81 Larr 61.89057914 1 May 13, 2019, 4:17 a.m.
82 yuying 59.17428125 1 May 20, 2019, 10:41 p.m.
83 xingzoudeyao 3.30617133 1 May 13, 2019, 1:18 a.m.

Data License


This competition is brought to you by IBM and University of Oxford. 

https://arxiv.org/abs/1712.00428

