
Commit e8c3cd0 (2 parents: d8bc7ee + ac12b2d)

Merge pull request #30 from peizhao-eth/master

Updated project 3 README

File tree: 1 file changed (+6 −6 lines)


Projects/3_Adversarial Search/README.md

Lines changed: 6 additions & 6 deletions
@@ -35,18 +35,18 @@ $ source activate aind
 - Download a copy of the project files from GitHub and navigate to the project folder. (Note: if you've previously downloaded the repository for another project then you can skip the clone command. However, you should run `git pull` to receive any project updates since you cloned the repository.)
 ```
 (aind) $ git clone https://github.com/udacity/artificial-intelligence
-(aind) $ cd "artificial-intelligence/Projects/3_Game Playing"
+(aind) $ cd "artificial-intelligence/Projects/3_Adversarial Search"
 ```
 
 
 ## Instructions
 
-You must implement an agent in the `CustomPlayer` class defined in the `game_agent.py` file. The interface definition for game agents only requires you to implement the `.get_action()` method, but you can add any other methods to the class that you deem necessary. You can build a basic agent by combining minimax search with alpha-beta pruning and iterative deepening from lecture.
+You must implement an agent in the `CustomPlayer` class defined in the `my_custom_player.py` file. The interface definition for game agents only requires you to implement the `.get_action()` method, but you can add any other methods to the class that you deem necessary. You can build a basic agent by combining minimax search with alpha-beta pruning and iterative deepening from lecture.
 
 **NOTE:** Your agent will **not** be evaluated in an environment suitable for running machine learning or deep learning agents (like AlphaGo); visit an office hours session **after** completing the project if you would like guidance on incorporating machine learning in your agent.
 
 #### The get_action() Method
-This function is called once per turn for each player. The calling function handles the time limit and
+This function is called once per turn for each player. The calling function handles the time limit and
 ```
 def get_action(self, state):
     import random
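The `get_action()` snippet above is cut off at the hunk boundary, so the body of the method is not visible in this diff. Purely as an illustration of the combination the README describes (iterative deepening driving minimax with alpha-beta pruning), here is a minimal sketch. Every name below is hypothetical: it is exercised on a toy take-1-or-2 Nim game, not the project's game API.

```python
import math

# Toy game used only to exercise the search: a pile of N stones, each move
# removes 1 or 2, and the player who takes the last stone wins.
# A state is (stones_remaining, player_to_move).
def actions(state):
    n, _ = state
    return [take for take in (1, 2) if take <= n]

def result(state, take):
    n, player = state
    return (n - take, 1 - player)

def terminal(state):
    return state[0] == 0

def utility(state, player):
    # The mover who emptied the pile has just won; state[1] is to move now.
    return 1.0 if state[1] != player else -1.0

def alpha_beta(state, player, depth, alpha=-math.inf, beta=math.inf):
    """Depth-limited minimax with alpha-beta pruning, from `player`'s view."""
    if terminal(state):
        return utility(state, player)
    if depth <= 0:
        return 0.0  # heuristic placeholder at the depth cutoff
    if state[1] == player:  # maximizing node
        value = -math.inf
        for a in actions(state):
            value = max(value, alpha_beta(result(state, a), player,
                                          depth - 1, alpha, beta))
            alpha = max(alpha, value)
            if alpha >= beta:
                break  # prune
        return value
    else:  # minimizing node
        value = math.inf
        for a in actions(state):
            value = min(value, alpha_beta(result(state, a), player,
                                          depth - 1, alpha, beta))
            beta = min(beta, value)
            if alpha >= beta:
                break  # prune
        return value

def get_action(state, max_depth=10):
    """Iterative deepening: keep the best move from the deepest search.

    In the real project the caller enforces the time limit, so each completed
    iteration would report its best move before going one ply deeper.
    """
    player = state[1]
    best = None
    for depth in range(1, max_depth + 1):
        _, best = max((alpha_beta(result(state, a), player, depth - 1), a)
                      for a in actions(state))
    return best
```

From a pile of 4 the winning move is to take 1 (leaving a multiple of 3), which the search finds once the deepening reaches the end of the game.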
@@ -78,7 +78,7 @@ Select at least one of the following to implement and evaluate in your report. (
 
 - Create a performance baseline using `run_search.py` (with the `fair_matches` flag enabled) to evaluate the effectiveness of your agent using the #my_moves - #opponent_moves heuristic from lecture
 - Use the same process to evaluate the effectiveness of your agent using your own custom heuristic
-
+
 **Hints:**
 - Research other games (chess, go, connect4, etc.) to get ideas for developing good heuristics
 - If the results of your tests are very close, try increasing the number of matches (e.g., >100) to increase your confidence in the results
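The baseline bullet above names the #my_moves - #opponent_moves heuristic from lecture. As a hedged sketch of its shape only: the `legal_moves` helper and the dict-based state below are stand-ins, not the project's game API, which provides its own move generator.

```python
# Illustrative stand-in for the game API: moves are stored directly in the
# state as a dict mapping each player (0 or 1) to a list of legal moves.
def legal_moves(state, player):
    return state["moves"][player]

def baseline_heuristic(state, player):
    """#my_moves - #opponent_moves: positive when we are more mobile."""
    return len(legal_moves(state, player)) - len(legal_moves(state, 1 - player))
```

Plugged into the depth cutoff of a minimax search, this rewards positions where the agent keeps more options open than its opponent.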
@@ -97,7 +97,7 @@ Select at least one of the following to implement and evaluate in your report. (
 - If the results are very close, try increasing the number of matches (e.g., >100) to increase your confidence in the results
 
 **Adding a basic opening book**
-- You will need to write your own code to develop a good opening book, but you can pass data into your agent by saving the file as "data.pickle" in the same folder as `game_agent.py`. Use the [pickle](https://docs.python.org/3/library/pickle.html) module to serialize the object you want to save. The pickled object will be accessible to your agent through the `self.data` attribute.
+- You will need to write your own code to develop a good opening book, but you can pass data into your agent by saving the file as "data.pickle" in the same folder as `my_custom_player.py`. Use the [pickle](https://docs.python.org/3/library/pickle.html) module to serialize the object you want to save. The pickled object will be accessible to your agent through the `self.data` attribute.
 
 For example, the contents of dictionary `my_data` can be saved to disk:
 ```
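The `my_data` example is truncated at the hunk boundary; only the `with open("data.pickle", 'wb') as f:` line survives in the next hunk header. A plausible completion of the save step, with placeholder dictionary contents, might look like this:

```python
import pickle

my_data = {"open_book": {"empty_board": 0}}  # placeholder contents

# Serialize the dictionary to data.pickle next to the agent file.
with open("data.pickle", "wb") as f:
    pickle.dump(my_data, f)

# Round-trip check; the project exposes the unpickled object as `self.data`.
with open("data.pickle", "rb") as f:
    restored = pickle.load(f)
```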
@@ -115,7 +115,7 @@ with open("data.pickle", 'wb') as f:
 - Create a performance baseline using `run_search.py` to evaluate the effectiveness of a baseline agent (e.g., an agent using your minimax or alpha-beta search code from the classroom)
 - Use `run_search.py` to evaluate the effectiveness of your agent using your own custom search techniques
 - You must decide whether to test with or without "fair" matches enabled--justify your choice in your report
-
+
 **Hints:**
 - If the results are very close, try increasing the number of matches (e.g., >100) to increase your confidence in the results
 - Experiment with adding more search time--does adding time confer any advantage to your agent?
