Timeline of Machine Intelligence Research Institute

{| class="sortable wikitable"
! Year !! Month and date !! Event type !! Details
|-
| 2014 || {{dts|March}}–May || Influence || [[wikipedia:Future of Life Institute|Future of Life Institute]] (FLI) is founded.<ref>{{cite web |url=http://lesswrong.com/lw/kcm/new_organization_future_of_life_institute_fli/ |title=New organization - Future of Life Institute (FLI) |author=Victoria Krakovna |accessdate=July 6, 2017 |publisher=[[wikipedia:LessWrong|LessWrong]] |quote=As of May 2014, there is an existential risk research and outreach organization based in the Boston area. The Future of Life Institute (FLI), spearheaded by Max Tegmark, was co-founded by Jaan Tallinn, Meia Chita-Tegmark, Anthony Aguirre and myself.}}</ref> MIRI is a partner organization to FLI.<ref>{{cite web |url=https://futureoflife.org/news-from-our-partner-organizations/ |title=News from our Partner Organizations |publisher=Future of Life Institute |accessdate=July 6, 2017}}</ref> The Singularity Summit, MIRI's annual conference from 2006 to 2012, also played "a key causal role in getting [[wikipedia:Max Tegmark|Max Tegmark]] interested and the FLI created".<ref name="shulman_miri_causal_influences" /> "Tallinn, a co-founder of FLI and of the Cambridge Centre for the Study of Existential Risk (CSER), cites MIRI as a key source for his views on AI risk".<ref>{{cite web |url=https://intelligence.org/2015/08/10/assessing-our-past-and-potential-impact/ |title=Assessing our past and potential impact |publisher=Machine Intelligence Research Institute |author=Rob Bensinger |date=August 10, 2015 |accessdate=July 6, 2017}}</ref>
|-
| 2014 || {{dts|March 12}}–13 || Staff || MIRI announces several recent hires. Among the new team members is Nate Soares, who would become MIRI's executive director in 2015.<ref name="recent_hires_at_miri_mar_2014" /> MIRI also hosts an Expansion Party to announce these hires to local supporters.<ref>{{cite web |url=https://intelligence.org/2014/03/18/miris-march-2014-newsletter/ |title=MIRI's March 2014 Newsletter |publisher=[[wikipedia:Machine Intelligence Research Institute|Machine Intelligence Research Institute]] |date=March 18, 2014 |accessdate=May 27, 2018 |first=Luke |last=Muehlhauser |quote=We recently hired four new researchers, including two new Friendly AI researchers. We announced this to our local supporters at the recent MIRI Expansion Party.}}</ref><ref>{{cite web |url=https://www.facebook.com/pg/MachineIntelligenceResearchInstitute/photos/?tab=album&album_id=655204764516911 |title=Machine Intelligence Research Institute - Photos |publisher=Facebook |accessdate=May 27, 2018}}</ref><ref>{{cite web |url=https://rockstarresearch.com/miri-expansion-party-with-one-medical-group/ |title=MIRI Expansion Party with One Medical Group |publisher=Rockstar Research |accessdate=May 27, 2018 |first=Louie |last=Helm |quote=RSVP for the MIRI Expansion Party w/ One Medical – March 12, 2014}}</ref>
|-
| 2014 || {{dts|May 3}}–11 || Workshop || The 7th Workshop on Logic, Probability, and Reflection takes place.<ref name="workshops" />
|-
| 2017 || {{dts|December 1}} || Financial || MIRI's 2017 fundraiser begins. The announcement post describes MIRI's fundraising targets, its recent work (including recent hires), and its strategic background (a high-level overview of how MIRI's work relates to long-term outcomes).<ref>{{cite web |url=https://intelligence.org/2017/12/01/miris-2017-fundraiser/ |title=MIRI's 2017 Fundraiser |publisher=[[wikipedia:Machine Intelligence Research Institute|Machine Intelligence Research Institute]] |author=Malo Bourgon |date=December 1, 2017 |accessdate=December 12, 2017}}</ref> The fundraiser would conclude with $2.5 million raised from over 300 distinct donors. The largest donation would come from {{w|Vitalik Buterin}} ($763,970 worth of {{W|Ethereum}}).<ref>{{cite web |url=https://intelligence.org/2018/01/10/fundraising-success/ |title=Fundraising success! |author=Malo Bourgon |publisher=[[wikipedia:Machine Intelligence Research Institute|Machine Intelligence Research Institute]] |date=January 10, 2018 |accessdate=January 30, 2018}}</ref>
|-
| 2018 || {{dts|October 29}}{{snd}}November 15 || Publication || The ''Embedded Agency'' sequence, by Abram Demski and Scott Garrabrant, is published in serialized installments from October 29 to November 8 on the MIRI blog (text version),<ref>{{cite web |url=https://intelligence.org/embedded-agency/ |title=Embedded Agency |publisher=Machine Intelligence Research Institute |accessdate=February 14, 2019}}</ref> on LessWrong 2.0 (illustrated version),<ref>{{cite web |url=https://www.lesswrong.com/s/Rm6oQRJJmhGCcLvxh |title=Embedded Agency |publisher=LessWrong 2.0 |date=October 29, 2018 |accessdate=February 14, 2019}}</ref> and on the Alignment Forum (illustrated version);<ref>{{cite web |url=https://www.alignmentforum.org/s/Rm6oQRJJmhGCcLvxh |title=Embedded Agency |publisher=AI Alignment Forum |date=October 29, 2018 |accessdate=February 14, 2019}}</ref> a full-text version follows on November 15.<ref>{{cite web |url=https://twitter.com/miriberkeley/status/1063166929899159552 |title=MIRI on Twitter |publisher=Twitter |accessdate=February 14, 2019 |quote="Embedded Agency" in finished form, with new material on self-reference and logical uncertainty}}</ref> The term "embedded agency" is a renaming of an existing concept researched at MIRI, called "naturalized agency".<ref>{{cite web |url=https://www.greaterwrong.com/posts/p7x32SEt43ZMC9r7r/embedded-agents/comment/rHjqqouz4KRG8Dj7y |author=Rob Bensinger |title=Rob Bensinger comments on Embedded Agents |publisher=LessWrong 2.0 viewer |accessdate=February 14, 2019}}</ref>
|-
| 2018 || {{dts|November 22}} || Strategy || Nate Soares, executive director of MIRI, publishes MIRI's 2018 update post (per its first footnote, the post is "an amalgam put together by a variety of MIRI staff" rather than Soares's work alone). The post describes new research directions at MIRI (not explained in detail due to MIRI's nondisclosure policy); explains the concept of "deconfusion" and why MIRI values it; announces MIRI's "nondisclosed-by-default" policy for most of its research; and gives a recruitment pitch for people to join MIRI.<ref>{{cite web |url=https://intelligence.org/2018/11/22/2018-update-our-new-research-directions/ |title=2018 Update: Our New Research Directions - Machine Intelligence Research Institute |publisher=Machine Intelligence Research Institute |date=November 22, 2018 |accessdate=February 14, 2019}}</ref>
|-
| 2018 || {{dts|November 26}} || Financial || MIRI's 2018 fundraiser begins.<ref>{{cite web |url=https://intelligence.org/2018/11/26/miris-2018-fundraiser/ |title=MIRI's 2018 Fundraiser |publisher=Machine Intelligence Research Institute |date=November 26, 2018 |accessdate=February 14, 2019}}</ref> The fundraiser would conclude on December 31 with $951,817 raised from 348 donors.<ref>{{cite web |url=https://intelligence.org/2019/02/11/our-2018-fundraiser-review/ |title=Our 2018 Fundraiser Review - Machine Intelligence Research Institute |publisher=Machine Intelligence Research Institute |date=February 11, 2019 |accessdate=February 14, 2019}}</ref>
|-
| 2018 || {{dts|December 15}} || Publication || MIRI announces a new edition of Eliezer Yudkowsky's ''Rationality: From AI to Zombies'' (i.e., the book version of "the Sequences"). At the time of the announcement, only two sequences of the new edition, ''Map and Territory'' and ''How to Actually Change Your Mind'', are available.<ref>{{cite web |url=https://intelligence.org/2018/12/15/announcing-new-raz/ |title=Announcing a new edition of "Rationality: From AI to Zombies" |publisher=Machine Intelligence Research Institute |date=December 16, 2018 |accessdate=February 14, 2019}}</ref><ref>{{cite web |url=https://www.lesswrong.com/posts/NjFgqv8bzjhXFaELP/new-edition-of-rationality-from-ai-to-zombies |title=New edition of "Rationality: From AI to Zombies" |publisher=LessWrong 2.0 |author=Rob Bensinger |accessdate=February 14, 2019}}</ref>
|}
