ResearchSpace

Speeding up Online POMDP planning - unification of observation branches by belief-state compression via expected feature values


dc.contributor.author Rens, G
dc.date.accessioned 2016-02-23T09:05:47Z
dc.date.available 2016-02-23T09:05:47Z
dc.date.issued 2015-01
dc.identifier.citation Rens, G. 2015. Speeding up Online POMDP planning - unification of observation branches by belief-state compression via expected feature values. In: 7th International Conference on Agents and Artificial Intelligence (ICAART) 2015, Lisbon Marriott Hotel, Portugal, 10 - 12 January 2015 en_US
dc.identifier.isbn 978-989-758-074-1
dc.identifier.uri http://www.scitepress.org/DigitalLibrary/ProceedingsDetails.aspx?ID=+mGlly8Sp00=&t=1
dc.identifier.uri http://hdl.handle.net/10204/8409
dc.description 7th International Conference on Agents and Artificial Intelligence (ICAART) 2015, Lisbon Marriott Hotel, Portugal, 10 - 12 January 2015. Due to copyright restrictions, the attached PDF file only contains the abstract of the full text item. For access to the full text item, please consult the publisher's website en_US
dc.description.abstract A novel algorithm to speed up online planning in partially observable Markov decision processes (POMDPs) is introduced. I propose a method for compressing nodes in belief-decision trees during planning. Whereas belief-decision trees normally branch on both actions and observations, with my method they branch only on actions. This is achieved by unifying the branches that would otherwise be required due to the nondeterminism of observations. The method is based on the expected values of domain features. The new algorithm is experimentally compared to three other online POMDP algorithms and outperforms them on the given test domain. en_US
dc.language.iso en en_US
dc.publisher Scitepress Digital Library en_US
dc.relation.ispartofseries Workflow;15635
dc.subject Online POMDP planning en_US
dc.subject POMDP en_US
dc.subject Partially Observable Markov Decision Processes en_US
dc.subject Heuristic en_US
dc.subject Optimization en_US
dc.subject Belief-state Compression en_US
dc.subject Expected Feature Values en_US
dc.title Speeding up Online POMDP planning - unification of observation branches by belief-state compression via expected feature values en_US
dc.type Conference Presentation en_US
dc.identifier.apacitation Rens, G. (2015). Speeding up Online POMDP planning - unification of observation branches by belief-state compression via expected feature values. Scitepress Digital Library. http://hdl.handle.net/10204/8409 en_ZA
dc.identifier.chicagocitation Rens, G. "Speeding up Online POMDP planning - unification of observation branches by belief-state compression via expected feature values." (2015): http://hdl.handle.net/10204/8409 en_ZA
dc.identifier.vancouvercitation Rens G, Speeding up Online POMDP planning - unification of observation branches by belief-state compression via expected feature values; Scitepress Digital Library; 2015. http://hdl.handle.net/10204/8409 . en_ZA
dc.identifier.ris TY - Conference Presentation AU - Rens, G AB - A novel algorithm to speed up online planning in partially observable Markov decision processes (POMDPs) is introduced. I propose a method for compressing nodes in belief-decision-trees while planning occurs. Whereas belief-decision-trees branch on actions and observations, with my method, they branch only on actions. This is achieved by unifying the branches required due to the nondeterminism of observations. The method is based on the expected values of domain features. The new algorithm is experimentally compared to three other online POMDP algorithms, outperforming them on the given test domain. DA - 2015-01 DB - ResearchSpace DP - CSIR KW - Online POMDP planning KW - POMDP KW - Partially Observable Markov Decision Processes KW - Heuristic KW - Optimization KW - Belief-state Compression KW - Expected Feature Values LK - https://researchspace.csir.co.za PY - 2015 SM - 978-989-758-074-1 T1 - Speeding up Online POMDP planning - unification of observation branches by belief-state compression via expected feature values TI - Speeding up Online POMDP planning - unification of observation branches by belief-state compression via expected feature values UR - http://hdl.handle.net/10204/8409 ER - en_ZA
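
The abstract above describes compressing a belief-decision tree by unifying, for each action, the child nodes that would otherwise be created for every possible observation, using expected values of domain features. The Python sketch below illustrates that general idea on a toy POMDP; all names (FEATURES, T_A, O_A, unify_observation_branches) and the specific merging rule are illustrative assumptions, not the algorithm from the paper.

import numpy as np

# Toy POMDP fragment (hypothetical): 3 states, 2 observations, one action.
# Each state is described by a 2-dimensional feature vector.
FEATURES = np.array([[0.0, 1.0],
                     [1.0, 0.0],
                     [1.0, 1.0]])                 # (num_states, num_features)
T_A = np.array([[0.8, 0.1, 0.1],                  # T[s, s'] = P(s' | s, a)
                [0.1, 0.8, 0.1],
                [0.1, 0.1, 0.8]])
O_A = np.array([[0.9, 0.1],                       # O[s', o] = P(o | s', a)
                [0.2, 0.8],
                [0.5, 0.5]])

def predicted_belief(belief, T):
    """Push the belief through the transition model (before observing)."""
    return T.T @ belief

def updated_belief(belief, T, O, obs):
    """Standard POMDP belief update for a single observation branch."""
    b = O[:, obs] * predicted_belief(belief, T)
    total = b.sum()
    return b / total if total > 0 else belief

def unify_observation_branches(belief, T, O):
    """Collapse all observation branches of an action into one node.

    Each branch's belief is weighted by the probability of its observation,
    and the branches are summarized by a single belief plus a single vector
    of expected feature values (an illustrative stand-in for the paper's
    compression, not its exact procedure)."""
    merged = np.zeros_like(belief)
    for obs in range(O.shape[1]):
        p_obs = float(O[:, obs] @ predicted_belief(belief, T))  # P(obs | belief, a)
        merged += p_obs * updated_belief(belief, T, O, obs)
    expected_features = merged @ FEATURES         # expected feature values of merged node
    return merged, expected_features

if __name__ == "__main__":
    b0 = np.array([1/3, 1/3, 1/3])
    merged, feats = unify_observation_branches(b0, T_A, O_A)
    print("merged belief:", merged)               # one child per action, not per (action, observation)
    print("expected feature values:", feats)

Merging the observation branches in this way leaves one successor node per action rather than one per action-observation pair, which is the reduction in tree size that the abstract credits for the speed-up.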

