Timeline of Machine Intelligence Research Institute

|-
| 2009 || {{dts|August 13}} || Social media || The Singularity Institute Twitter account, singinst, is created.<ref>{{cite web |url=https://twitter.com/singinst |title=SingularityInstitute (@singinst) |publisher=Twitter |accessdate=July 4, 2017}}</ref>
|-
| 2009 || {{dts|September}} || Staff || Amy Willey Labenz begins an internship at MIRI. During the internship in November, she would uncover the embezzlement.<ref name="amy-email-2022-05-27">Amy Willey Labenz. Personal communication. May 27, 2022.</ref>
|-
| 2009 || {{dts|October}} || Project || A website maintained by MIRI, ''The Uncertain Future'', first appears around this time.<ref>{{cite web |url=https://web.archive.org/web/20090101000000*/http://theuncertainfuture.com/ |title=Wayback Machine |accessdate=July 2, 2017}} The first snapshot is from October 5, 2009.</ref><ref>{{cite web |url=https://www.google.com/search?q=http%3A%2F%2Ftheuncertainfuture.com%2F&source=lnt&tbs=cdr%3A1%2Ccd_min%3A1%2F1%2F2009%2Ccd_max%3A1%2F1%2F2010&tbm= |title=theuncertainfuture.com - Google Search |accessdate=July 2, 2017}} The earliest cache seems to be from October 25, 2009. Checking the Jan 1, 2008 – Jan 1, 2009 range produces no result.</ref> The goal of the website is to "allow those interested in future technology to form their own rigorous, mathematically consistent model of how the development of advanced technologies will affect the evolution of civilization over the next hundred years".<ref>{{cite web |url=http://theuncertainfuture.com/ |title=The Uncertain Future |accessdate=July 2, 2017 |publisher=Machine Intelligence Research Institute}}</ref> Work on the project started in 2008.<ref name=hplus-tuf>{{cite web|url = http://hplusmagazine.com/2011/02/04/the-uncertain-future-forecasting-project-goes-open-source/|title = The Uncertain Future Forecasting Project Goes Open-Source|archiveurl = http://web.archive.org/web/20120413174829/http://hplusmagazine.com/2011/02/04/the-uncertain-future-forecasting-project-goes-open-source/|date = February 4, 2011|archivedate = April 13, 2012|accessdate = July 15, 2017|publisher = H Plus Magazine|last = McCabe|first = Thomas}}</ref>
|-
| 2009 || {{dts|October 3}}–4 || Conference || The Singularity Summit 2009 takes place in New York.<ref>{{cite web |url=https://web.archive.org/web/20091217213848/http://www.singularitysummit.com/program |title=The Singularity Summit 2009 &gt; Program |accessdate=June 30, 2017}}</ref><ref>{{cite web |url=http://www.popsci.com/scitech/article/2009-10/singularity-summit-2009-singularity-near |publisher=Popular Science |title=Singularity Summit 2009: The Singularity Is Near |accessdate=June 30, 2017 |date=October 2, 2009 |author=Stuart Fox}}</ref>
|-
| 2009 || {{dts|November}} || Financial || Embezzlement: "Misappropriation of assets, by a contractor, was discovered in November 2009."<ref>{{cite web |url=https://intelligence.org/files/2009-SIAI990.pdf |title=Form 990 2009 |accessdate=July 8, 2017}}</ref>
|-
| 2009 || {{dts|December}} || Staff || Amy Willey Labenz, previously an intern, joins MIRI as Chief Compliance Officer, partly due to her uncovering of the embezzlement in November.<ref name="siai_accomplishments_20110621" /><ref name="amy-email-2022-05-27" />
|-
| 2009 || {{dts|December 11}} || Influence || The third edition of ''[[wikipedia:Artificial Intelligence: A Modern Approach|Artificial Intelligence: A Modern Approach]]'' by [[wikipedia:Stuart J. Russell|Stuart J. Russell]] and [[wikipedia:Peter Norvig|Peter Norvig]] is published. In this edition, for the first time, Friendly AI is mentioned and Eliezer Yudkowsky is cited.
|-
| 2010 || {{dts|February 28}} || Publication || The first chapter of Eliezer Yudkowsky's fan fiction ''{{w|Harry Potter and the Methods of Rationality}}'' is published. The work would be published serially, concluding on March 14, 2015.<ref>{{cite web |url=https://www.fanfiction.net/s/5782108/1/Harry-Potter-and-the-Methods-of-Rationality |title=Harry Potter and the Methods of Rationality Chapter 1: A Day of Very Low Probability, a harry potter fanfic |publisher=FanFiction |accessdate=July 1, 2017 |quote=Updated: 3/14/2015 - Published: 2/28/2010}}</ref><ref>{{cite web |url=https://www.vice.com/en_us/article/gq84xy/theres-something-weird-happening-in-the-world-of-harry-potter-168 |publisher=Vice |title=The Harry Potter Fan Fiction Author Who Wants to Make Everyone a Little More Rational |date=March 2, 2015 |author=David Whelan |accessdate=July 1, 2017}}</ref> The fan fiction would become the initial point of contact with MIRI for several of its larger donors.<ref>{{cite web |url=https://intelligence.org/2014/04/02/2013-in-review-fundraising/#identifier_2_10812 |title=2013 in Review: Fundraising - Machine Intelligence Research Institute |publisher=Machine Intelligence Research Institute |date=August 13, 2014 |accessdate=July 1, 2017 |quote=Recently, we asked (nearly) every donor who gave more than $3,000 in 2013 about the source of their initial contact with MIRI, their reasons for donating in 2013, and their preferred methods for staying in contact with MIRI. [&hellip;] Four came into contact with MIRI via HPMoR.}}</ref>
|-
| 2010 || {{dts|April}} || Staff || Amy Willey Labenz is promoted to Chief Operating Officer; she was previously the Chief Compliance Officer. From 2010 to 2012 she would also serve as the Executive Producer of the Singularity Summits.<ref name="amy-email-2022-05-27"/>
|-
| 2010 || {{dts|June 17}} || Popular culture || ''{{w|Zendegi}}'', a science fiction book by {{w|Greg Egan}}, is published. The book includes a character called Nate Caplan (partly inspired by Eliezer Yudkowsky and Robin Hanson), a website called Overpowering Falsehood dot com (partly inspired by Overcoming Bias and LessWrong), and a Benign Superintelligence Bootstrap Project, inspired by the Singularity Institute's friendly AI project.<ref>{{cite web|url = http://gareth-rees.livejournal.com/31182.html|title = Zendegi - Gareth Rees|date = August 17, 2010|accessdate = July 15, 2017|last = Rees|first = Gareth}}</ref><ref>{{Cite web|url = http://lesswrong.com/lw/2ti/greg_egan_disses_standins_for_overcoming_bias/|title = Greg Egan disses stand-ins for Overcoming Bias, SIAI in new book|last = Sotala|first = Kaj|date = October 7, 2010|accessdate = July 15, 2017}}</ref><ref>{{cite web|url = http://www.overcomingbias.com/2012/03/egans-zendegi.html|title = Egan’s Zendegi|date = March 25, 2012|accessdate = July 15, 2017|last = Hanson|first = Robin}}</ref>
|-
| 2012 || {{dts|November 11}}–18 || Workshop || The 1st Workshop on Logic, Probability, and Reflection takes place.<ref name="workshops">{{cite web |url=https://intelligence.org/workshops/ |title=Research Workshops - Machine Intelligence Research Institute |publisher=Machine Intelligence Research Institute |accessdate=July 1, 2017}}</ref>
|-
| 2012 || {{dts|December 6}} || || Singularity University announces that it has acquired the Singularity Summit from MIRI.<ref>{{cite web |url=http://singularityu.org/2012/12/09/singularity-university-acquires-the-singularity-summit/ |title=Singularity University Acquires the Singularity Summit |publisher=Singularity University |date=December 9, 2012 |accessdate=June 30, 2017}}</ref> Joshua Fox praises the move, noting: "The Singularity Summit was always off-topic for SI: more SU-like than SI-like."<ref name=singularity-wars>{{cite web|url = http://lesswrong.com/lw/gn4/the_singularity_wars/|title = The Singularity Wars|last = Fox|first = Joshua|date = February 14, 2013|accessdate = July 15, 2017|publisher = LessWrong}}</ref> However, Singularity University would not continue the original tradition of the Summit,<ref>{{cite web|url = https://www.facebook.com/groups/200945030405983/permalink/228705307629955/|title = The Singularity Summit was an annual event from 2006 through 2012|last = Vance|first = Alyssa|date = May 27, 2017|accessdate = July 15, 2017}}</ref> and the later EA Global conference (organized in some years by Amy Willey Labenz, who used to work at MIRI) would inherit some of the characteristics of the Singularity Summit.<ref>{{cite web|url = https://groups.google.com/forum/#!topic/long-term-world-improvement/oYSW9XfA-FY|title = EA Global Boston|last = Vance|first = Alyssa|last2 = Sotala|first2 = Kaj|last3 = Luczkow|first3 = Vincent|date = June 6, 2017|accessdate = July 15, 2017}}</ref> Around this time, Amy Willey Labenz also leaves MIRI.<ref name="amy-email-2022-05-27"/>
|-
| 2013 || || Mission || The organization's mission changes to: "To ensure that the creation of smarter-than-human intelligence has a positive impact. Thus, the charitable purpose of the organization is to: a) perform research relevant to ensuring that smarter-than-human intelligence has a positive impact; b) raise awareness of this important issue; c) advise researchers, leaders and laypeople around the world; d) as necessary, implement a smarter-than-human intelligence with humane, stable goals."<ref>{{cite web |url=https://intelligence.org/wp-content/uploads/2012/06/2013-990.pdf |title=Form 990 2013 |accessdate=July 8, 2017}}</ref> This mission would stay the same for 2014 and 2015.
* modal combat and some other domains: [https://github.com/machine-intelligence/provability/blob/master/src/ModalCombat.hs], [http://viewdns.info/reversewhois/?q=Machine%20Intelligence%20Research%20Institute], [http://viewdns.info/reversewhois/?q=Nate%20Soares]
* https://www.lesswrong.com/posts/yGZHQYqWkLMbXy3z7/video-q-and-a-with-singularity-institute-executive-director
* https://ea.greaterwrong.com/posts/NBgpPaz5vYe3tH4ga/on-deference-and-yudkowsky-s-ai-risk-estimates
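The modal-combat link above points to a Haskell implementation in MIRI's repository. As a rough illustration of the underlying idea only (not the repo's code), here is a minimal Python sketch assuming the standard Kripke-frame evaluation of modal agents from the "robust cooperation" literature; all function and agent names here are illustrative.

```python
# Hypothetical sketch of "modal combat". Modal agents are formulas of
# provability logic GL, and a match between two agents can be decided by
# evaluating their formulas on finite linear Kripke frames: ("box", f)
# holds at level k iff f holds at every level j < k. Because each agent
# reference sits under a box, the recursion terminates and values stabilize.

def ev(formula, env, level, memo):
    """Evaluate a modal formula at one level of the linear frame.

    formula: "top", "bot", an agent name from env, or a tuple
             ("box", f), ("and", f, g), ("not", f).
    env:     maps agent names to their (possibly mutually recursive) formulas.
    """
    key = (formula, level)
    if key in memo:
        return memo[key]
    if formula == "top":
        value = True
    elif formula == "bot":
        value = False
    elif isinstance(formula, str):      # agent name: unfold its formula
        value = ev(env[formula], env, level, memo)
    elif formula[0] == "box":           # "provable": true at all lower levels
        value = all(ev(formula[1], env, j, memo) for j in range(level))
    elif formula[0] == "and":
        value = ev(formula[1], env, level, memo) and ev(formula[2], env, level, memo)
    else:                               # ("not", f)
        value = not ev(formula[1], env, level, memo)
    memo[key] = value
    return value

def outcome(env, depth=10):
    """Each agent's stabilized action: True = cooperate, False = defect."""
    memo = {}
    return {name: ev(name, env, depth, memo) for name in env}

# FairBot cooperates iff it can prove its opponent cooperates with it.
fairbots = {"a": ("box", "b"), "b": ("box", "a")}
print(outcome(fairbots))                        # Löbian mutual cooperation

vs_defectbot = {"a": ("box", "b"), "b": "bot"}  # DefectBot never cooperates
print(outcome(vs_defectbot))                    # FairBot safely defects
```

In this sketch two FairBots cooperate with each other while FairBot defects against DefectBot, matching the well-known results that motivated the modal-combat formalism.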
===Timeline update strategy===
