Task #      Task Title                                                      Marker (Year, Qtr)
----------------------------------------------------------------------------------------------
1.          Neutronics Benchmark Task Lead – T. Downar, UM
  1.1       Steady State (SS)
    1.1.1   Survey candidate problems
    1.1.2   Preliminary SS modeling of candidate problems
    1.1.3   Down-select to two problems for benchmark evaluation            +  Year 1, Qtr 1
    1.1.4   SS modeling with deterministic U.S. NRC codes PARCS/AGREE
    1.1.5   SS modeling with deterministic NEAMS code PROTEUS
    1.1.6   SS modeling with Monte Carlo code OpenMC
    1.1.7   Comparison of experimental data & model results
    1.1.8   Benchmark level evaluation of selected problems                 +  Year 1, Qtr 3
    1.1.9   Evaluation of uncertainties in selected problems
    1.1.10  Preparation of IRPhEP documentation
    1.1.11  Submission of SS benchmark for peer review                      =  Year 1, Qtr 4
  1.2       Transient (TR)
    1.2.1   Survey available TREAT TR data for benchmark problem
    1.2.2   Preliminary TR modeling of candidate problems
    1.2.3   Down-select to two problems for benchmark evaluation            +  Year 2, Qtr 2
    1.2.4   Perform TR modeling with deterministic U.S. NRC codes PARCS/AGREE
    1.2.5   Perform TR modeling with deterministic NEAMS code PROTEUS
    1.2.6   Perform TR modeling with Monte Carlo code OpenMC
    1.2.7   Benchmark level evaluation of selected problems                 +  Year 3, Qtr 1
    1.2.8   Evaluation of uncertainties in selected problems
    1.2.9   Preparation of IRPhEP documentation
    1.2.10  Submission of TR benchmark for peer review                      =  Year 3, Qtr 4
2.          Loop Thermal-Hydraulics Task Lead – W. Marcum, OSU
  2.1       Sodium Loop
    2.1.1   Survey literature of existing sodium test data
    2.1.2   Select two candidate problems
    2.1.3   Organize and document data for two candidate problems           =  Year 1, Qtr 2
    2.1.4   Identify and review industry needs for sodium loop data
    2.1.5   Down-select to one problem for benchmark evaluation             +  Year 1, Qtr 3
    2.1.6   Preliminary modeling with industry tool STAR-CCM+
    2.1.7   Preliminary modeling with NEAMS code Nek5000
    2.1.8   Comparison of experimental data & model results for problem     +  Year 2, Qtr 2
    2.1.9   Benchmark level evaluation of problem
    2.1.10  Evaluation of uncertainties in selected problem
    2.1.11  Submission of benchmark for peer review                         =  Year 3, Qtr 3
  2.2       Water Loop
    2.2.1   Identify and review industry needs for water loop
    2.2.2   Develop loop technical and functional requirements
    2.2.3   Loop design                                                     +  Year 1, Qtr 3
    2.2.4   Loop fabrication
    2.2.5   Loop shakedown
    2.2.6   Define flow loop ‘operations tests’ and ‘benchmark tests’       +  Year 2, Qtr 2
    2.2.7   Operations test conduct
    2.2.8   Synthesis of operations tests data
    2.2.9   Benchmark test conduct
    2.2.10  Synthesis of benchmark test data
    2.2.11  Modeling of benchmark test with U.S. NRC code TRACE
    2.2.12  Modeling of benchmark test with RELAP5-3D
    2.2.13  Comparison of experimental data & model results for problem
    2.2.14  Benchmark level evaluation of problem
    2.2.15  Evaluation of uncertainties in selected problem
    2.2.16  Submission of benchmark for peer review                         =  Year 3, Qtr 4
3.          Core Instrumentation Task Lead – L.W. Hu, MIT
  3.1       Instrumentation Plan
    3.1.1   Review TREAT core design and test plans
    3.1.2   Identify core parameter monitoring needs
    3.1.3   Determine applicable flux/measurement range
    3.1.4   Select TREAT core instrumentation                               +  Year 1, Qtr 3
    3.1.5   Identify instrumentation calibration requirements
    3.1.6   Develop TREAT core instrumentation plan                         =  Year 1, Qtr 4
  3.2       Initial Benchmark Evaluation
    3.2.1   Develop benchmark experimental plan using the MITR
    3.2.2   Select test instrumentation
    3.2.3   Design test assembly for OSTR and MITR
    3.2.4   Conduct experiment safety review and approval
    3.2.5   Perform core analysis with MCODE
    3.2.6   Assemble and test data acquisition systems                      +  Year 2, Qtr 2
    3.2.7   Perform steady-state experiments at MITR
    3.2.8   Perform steady-state experiments at OSTR
    3.2.9   Perform transient experiments at MITR
    3.2.10  Perform transient experiments at OSTR
    3.2.11  Analyze experimental data
    3.2.12  Evaluate core analysis and instrumentation measurement uncertainties
    3.2.13  Submission of detailed final instrumentation report             =  Year 3, Qtr 4
4.          Meetings
  4.1       IRP Kickoff Meeting (Idaho National Laboratory)
  4.2       Biannual Collaborators’ Meeting (University of Michigan)
  4.3       Annual Review Meeting (Massachusetts Institute of Technology)
  4.4       Biannual Collaborators’ Meeting (TBD)
  4.5       Annual Review Meeting (Oregon State University)
  4.6       Biannual Collaborators’ Meeting (TBD)
  4.7       Monthly Conference Call