Package: binaryRL
Version: 0.9.0
Title: Reinforcement Learning Tools for Two-Alternative Forced Choice
        Tasks
Description: Tools for building reinforcement learning (RL) models
  tailored to Two-Alternative Forced Choice (TAFC) tasks, which are
  commonly employed in psychological research. These models build on
  the foundational principles of model-free reinforcement learning
  described in Sutton and Barto (2018) <ISBN:9780262039246>. The
  package allows RL models to be defined intuitively with simple
  if-else statements. The approach to constructing and evaluating
  these computational models follows the guidelines proposed by
  Wilson and Collins (2019) <doi:10.7554/eLife.49547>. The example
  datasets included with the package are sourced from Mason et al.
  (2024) <doi:10.3758/s13423-023-02415-x>.
Authors@R: 
  c(person(
    given = "YuKi",
    role = c("aut", "cre"),
    email = "hmz1969a@gmail.com",
    comment = c(ORCID = "0009-0000-1378-1318")
  ))
Maintainer: YuKi <hmz1969a@gmail.com>
URL: https://yuki-961004.github.io/binaryRL/
BugReports: https://github.com/yuki-961004/binaryRL/issues
License: GPL-3
Encoding: UTF-8
LazyData: TRUE
RoxygenNote: 7.3.2
Depends: R (>= 4.0.0)
Imports: future, doFuture, foreach, doRNG, progressr
Suggests: stats, GenSA, GA, DEoptim, pso, mlrMBO, mlr, ParamHelpers,
        smoof, lhs, DiceKriging, rgenoud, cmaes, nloptr
NeedsCompilation: no
Packaged: 2025-07-08 13:00:55 UTC; hmz19
Author: YuKi [aut, cre] (ORCID: <https://orcid.org/0009-0000-1378-1318>)
Repository: CRAN
Date/Publication: 2025-07-08 13:20:02 UTC
Built: R 4.3.3; ; 2025-07-08 14:46:31 UTC; unix
