---
title: "Jam 08 - Exercise 1"
tags:
- 2 📝 in writing
- 3 🧪 in testing
- 4 🥳 done
---
<!-- markdownlint-disable line-length single-h1 no-inline-html -->
<!-- markdownlint-configure-file { "ul-indent": { "indent": 4 }, "link-fragments": {"ignore_case": true} } -->
{%hackmd dJZ5TulxSDKme-3fSY4Lbw %}
# Exercise 1 - Stock Data Model
## Overview - Exercise 1
Before we can visualize stock market data, we need to create a robust model to represent and process it. We'll build this in steps:
1. Create a basic data class to represent a single stock record
2. Build a utility class to load stock data from files
3. Implement a model class that uses Streams API for data analysis
## Understanding Streams
Before we start implementing our classes, let's take a deep dive into understanding Streams and how they can help us process our stock data. We'll use Streams in both our data loading and analysis code.
A Stream is a sequence of elements that can be processed in parallel or sequentially. Think of it like a pipeline where data flows through various operations. The key benefits of using Streams are:
- **Declarative**: You describe what you want to do, not how to do it
- **Lazy**: Operations are only performed when needed
- **Parallelizable**: Can automatically take advantage of multiple cores
- **Composable**: Operations can be chained together
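To make "declarative" concrete, here is a small side-by-side illustration (not part of the exercise): the same filtering task written as a loop and as a Stream pipeline.
```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

List<Integer> grades = Arrays.asList(85, 92, 78, 95);

// Imperative: we spell out *how* to build the result, step by step
List<Integer> highGradesLoop = new ArrayList<>();
for (Integer grade : grades) {
    if (grade >= 85) {
        highGradesLoop.add(grade);
    }
}

// Declarative: we describe *what* we want and let the Stream handle the rest
List<Integer> highGradesStream = grades.stream()
        .filter(grade -> grade >= 85)
        .toList();
```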
### Stream Operations Deep Dive
Let's explore each type of Stream operation in detail using a consistent example: processing a list of student grades. This will help us see how operations can be chained together to solve real problems.
#### 1. Creating Streams
Before we can use Stream operations, we need to create a Stream. There are several ways to do this:
```java
import java.util.Arrays;
import java.util.List;
import java.util.stream.Stream;
// Our input data
List<Integer> grades = Arrays.asList(85, 92, 85, 78, 95, 92, 88, 85);
System.out.println("Original grades: " + grades);
// Output: Original grades: [85, 92, 85, 78, 95, 92, 88, 85]
// Create a Stream from the list
Stream<Integer> gradeStream = grades.stream();
// Note: The original list is unchanged
```
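The list-based `stream()` call above is the one we will use most, but for reference, Streams can also be created directly from values or from a range of numbers:
```java
import java.util.stream.IntStream;
import java.util.stream.Stream;

// Create a Stream directly from individual values
Stream<Integer> someGrades = Stream.of(85, 92, 78);

// Create a Stream of the numbers 0 through 9
IntStream indexes = IntStream.range(0, 10);
```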
#### 2. Filtering Operations
Filtering operations help us select specific elements from our Stream:
```java
// Our input data
List<Integer> grades = Arrays.asList(85, 92, 85, 78, 95, 92, 88, 85);
System.out.println("Original grades: " + grades);
// Output: Original grades: [85, 92, 85, 78, 95, 92, 88, 85]
// Note: The original list is unchanged
// Create a Stream and apply filtering operations
Stream<Integer> gradeStream = grades.stream()
// Removes duplicate grades (e.g., [85, 92, 85] -> [85, 92])
.distinct()
// Keeps only grades >= 85 (e.g., [85, 92, 78] -> [85, 92])
.filter(grade -> grade >= 85)
// Keeps only the first 3 grades (e.g., [85, 92, 78, 95] -> [85, 92, 78])
.limit(3)
// Removes the first grade (e.g., [85, 92, 78] -> [92, 78])
.skip(1);
```
#### 3. Mapping Operations
Mapping operations transform each element in the Stream:
```java
// Our input data
List<List<Integer>> gradeRanges = Arrays.asList(
Arrays.asList(85, 86, 87),
Arrays.asList(92, 93)
);
System.out.println("Original grade ranges: " + gradeRanges);
// Output: Original grade ranges: [[85, 86, 87], [92, 93]]
// Note: The original list is unchanged
// Creates a Stream containing all grades from all ranges
Stream<Integer> flattenedGradesStream = gradeRanges.stream()
.flatMap(List::stream);
```
For simple transformations, we can use map:
```java
// Our input data
List<Integer> grades = Arrays.asList(85, 92, 85, 78, 95, 92, 88, 85);
System.out.println("Original grades: " + grades);
// Output: Original grades: [85, 92, 85, 78, 95, 92, 88, 85]
// Note: The original list is unchanged
// Create a Stream and apply mapping operations
Stream<String> gradeStream = grades.stream()
// Converts each grade to a decimal (e.g., 85 -> 0.85)
.mapToDouble(grade -> grade / 100.0)
// Converts each decimal back to a percentage point (e.g., 0.85 -> 85)
.mapToInt(percentage -> (int)(percentage * 100))
// Converts each percentage to a letter grade using if statements (e.g., 85 -> "B")
.mapToObj(grade -> {
if (grade >= 90) return "A";
if (grade >= 80) return "B";
if (grade >= 70) return "C";
return "F";
})
// Converts each letter grade to a description using switch (e.g., "B" -> "Good")
.map(letter -> {
switch (letter) {
case "A": return "Excellent";
case "B": return "Good";
case "C": return "Passing";
default: return "Needs Improvement";
}
});
```
#### 4. Chaining Operations
Did you notice something cool in that last mapping example? We chained together multiple operations without any semicolons between them! That's because each operation returns a new Stream, allowing us to build a pipeline where each operation feeds into the next one. We can do the same with any combination of Stream operations.
The key points about chaining are:
1. Operations are applied in order from left to right
2. Each operation processes the elements from the previous operation
3. The final result depends on the order of operations
4. Some operations might eliminate elements, affecting subsequent operations
:::info
🧠 **Chaining Tips**
- Think about the order of operations carefully
- Filtering operations early can reduce the number of elements processed by later operations
- Each operation in the chain transforms the data for the next operation
- The final result is determined by the entire chain of operations
:::
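Here is a small chained pipeline that puts these points together, using the same grades list as before; the comments show how each operation reshapes the data.
```java
// Our input data
List<Integer> grades = Arrays.asList(85, 92, 85, 78, 95, 92, 88, 85);

// Filtering early keeps later operations cheap; each step returns a new Stream
Stream<String> letterGrades = grades.stream()
        .distinct()                              // [85, 92, 78, 95, 88]
        .filter(grade -> grade >= 85)            // [85, 92, 95, 88]
        .map(grade -> grade >= 90 ? "A" : "B");  // ["B", "A", "A", "B"]
```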
#### 5. Terminal Operations
Terminal operations consume the Stream and produce a final result. Once a terminal operation is performed, the Stream cannot be reused. Let's look at each terminal operation:
First, let's set up our test data:
```java
// Our input data
List<Integer> grades = Arrays.asList(85, 92, 85, 78, 95, 92, 88, 85);
System.out.println("Original grades: " + grades);
// Output: Original grades: [85, 92, 85, 78, 95, 92, 88, 85]
```
**toList()**: Gathers results into a List
```java
// Collect high grades into a List
List<Integer> highGrades = grades.stream()
.filter(grade -> grade >= 85)
.toList();
System.out.println("High grades: " + passingGrades);
// Output: High grades: [85, 92, 85, 95, 92, 88, 85]
// Note: The Stream is consumed after toList()
```
**reduce(T identity, BinaryOperator<T>)**: Combines elements
:::info
🧠 **Method References**
Remember the lecture on lambda? When you want to use an existing method as a lambda expression, you can use the method reference syntax (`::`). It's a shorthand way to write a lambda that just calls a method. For example, `Integer::sum` is equivalent to `(a, b) -> a + b`. Method references are especially useful when you want to use existing methods from the Java API.
:::
```java
// Calculate the sum of all grades
int total = grades.stream()
.reduce(0, Integer::sum); // Using method reference for addition
System.out.println("Total of all grades: " + total);
// Output: Total of all grades: 700
// Note: The Stream is consumed after reduce()
```
**forEach(Consumer<T>)**: Performs an action on each element
```java
// Print each grade
System.out.println("Printing each grade:");
grades.stream()
.forEach(grade -> System.out.println("Grade: " + grade));
// Output:
// Printing each grade:
// Grade: 85
// Grade: 92
// Grade: 85
// Grade: 78
// Grade: 95
// Grade: 92
// Grade: 88
// Grade: 85
// Note: The Stream is consumed after forEach()
```
**summaryStatistics()**: Calculates statistics
:::info
🧠 **DoubleSummaryStatistics**
The `DoubleSummaryStatistics` class holds summary statistics gathered from a stream of numbers. It automatically tracks:
- Count of elements
- Sum of all values
- Average of all values
- Minimum value
- Maximum value
This is particularly useful when you need multiple statistics about your data without having to process the stream multiple times.
:::
```java
// Calculate statistics about the grades
DoubleSummaryStatistics stats = grades.stream()
.mapToDouble(n -> n)
.summaryStatistics();
System.out.println("Grade statistics:");
System.out.println("Average: " + stats.getAverage());
System.out.println("Highest: " + stats.getMax());
System.out.println("Lowest: " + stats.getMin());
// Output:
// Grade statistics:
// Average: 87.5
// Highest: 95.0
// Lowest: 78.0
// Note: The Stream is consumed after summaryStatistics()
```
:::warning
🧠 **Important: Stream Consumption**
- Terminal operations consume the Stream
- After a terminal operation, the Stream cannot be reused
- If you need to use the same data multiple times, you need to create a new Stream each time
```java
Stream<Integer> stream = grades.stream();
List<Integer> firstResult = stream.toList(); // OK
List<Integer> secondResult = stream.toList(); // Error! Stream already consumed
```
:::
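If you do need the same data for more than one terminal operation, simply build a fresh Stream from the source collection each time:
```java
List<Integer> firstResult = grades.stream().toList();   // OK
List<Integer> secondResult = grades.stream().toList();  // Also OK - this is a brand-new Stream
```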
:::info
🧠 **Testing Stream Operations**
- Test with empty collections
- Test with single-element collections
- Test with collections containing null values
- Test with collections containing duplicate values
- Test with collections containing special values (e.g., zero, negative numbers)
:::
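As one example of what such a test could look like (a sketch only; the class name `StreamEdgeCaseTest` is made up for illustration, and JUnit 5 is assumed, as in the test class later in this exercise):
```java
import static org.junit.jupiter.api.Assertions.assertTrue;

import java.util.List;
import org.junit.jupiter.api.Test;

public class StreamEdgeCaseTest {

    /** Filtering an empty collection should simply produce an empty result. */
    @Test
    void testFilterOnEmptyList() {
        List<Integer> empty = List.of();
        List<Integer> result = empty.stream()
                .filter(grade -> grade >= 85)
                .toList();
        assertTrue(result.isEmpty());
    }
}
```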
## Part 1 - The Stock Class
Now that we understand how Streams work, let's put this knowledge into practice by building our stock market data model. We'll use Streams to process stock data efficiently, from loading CSV files to calculating statistics. First, we'll create the foundation - a class to represent individual stock records.
Let's start with a simple class to represent a single stock data point. This class will be immutable, since each instance represents historical data that shouldn't change.
```plantuml
@startuml
skinparam classAttributeIconSize 0
skinparam class {
BackgroundColor White
ArrowColor Black
BorderColor Black
}
class Stock {
-<<final>> date: LocalDate
-<<final>> symbol: String
-<<final>> openPrice: BigDecimal
-<<final>> highPrice: BigDecimal
-<<final>> lowPrice: BigDecimal
-<<final>> closePrice: BigDecimal
-<<final>> volume: long
+Stock(csvRecord: String)
+getDate(): LocalDate
+getSymbol(): String
+getOpenPrice(): BigDecimal
+getHighPrice(): BigDecimal
+getLowPrice(): BigDecimal
+getClosePrice(): BigDecimal
+getVolume(): long
+toString(): String
}
@enduml
```
📝 **Implementation Steps**
1. Create the `Stock` class with all fields marked as `final`
2. Implement the constructor `Stock(csvRecord: String)` and parse the CSV record String:
```text
Example: 2024-01-01,AAPL,180.50,185.00,180.00,184.75,1000000
Format: Date,Symbol,Open,High,Low,Close,Volume
```
:::warning
🧠 **CSV Parsing Tips**
- Remember that CSV (Comma-Separated Values) files use commas as delimiters
- Java's String class has methods for splitting strings by a delimiter
- Think about validation:
- How many fields should you have?
- What order are they in?
- What types should each field be?
- Consider what exceptions might occur during parsing:
- Missing fields?
- Invalid date format?
- Non-numeric values where numbers are expected?
:::
3. Use IntelliJ's code generator to create getters (Code → Generate → Getter)
4. Implement `toString()` to display the stock data clearly
5. Add proper error handling for malformed records
**Key Points**:
- Use `LocalDate.parse()` for the date
- Use `new BigDecimal(String)` for prices
- Use `Long.parseLong()` for volume
- Throw `IllegalArgumentException` for invalid data
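If you are unsure how those pieces fit together, here is one possible shape for the constructor (a sketch only, assuming the seven-field order shown above; your validation and error messages may differ):
```java
public Stock(String csvRecord) {
    String[] fields = csvRecord.split(",");
    if (fields.length != 7) {
        throw new IllegalArgumentException("Expected 7 fields but got " + fields.length);
    }
    try {
        this.date = LocalDate.parse(fields[0]);
        this.symbol = fields[1];
        this.openPrice = new BigDecimal(fields[2]);
        this.highPrice = new BigDecimal(fields[3]);
        this.lowPrice = new BigDecimal(fields[4]);
        this.closePrice = new BigDecimal(fields[5]);
        this.volume = Long.parseLong(fields[6]);
    } catch (java.time.format.DateTimeParseException | NumberFormatException e) {
        throw new IllegalArgumentException("Malformed stock record: " + csvRecord, e);
    }
}
```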
## Part 2 - The StockDataLoader Utility Class
Next, we'll create a utility class for loading stock data from CSV files. This introduces us to the **Utility Class Pattern**, a common design pattern in Java applications.
📝 **Design Pattern: Utility Class**
**What is it?**
- A class containing only static methods and constants
- Cannot be instantiated (private constructor)
- No instance state or behavior
- Often used for file operations, string manipulation, math calculations, etc.
**Why use it?**
- Groups related operations that don't belong to any specific object
- Avoids duplicating utility code across multiple classes
- Provides a clear, centralized location for common operations
- Makes code more maintainable and reusable
**Common Examples in Java API**:
- `java.util.Collections` - Collection utility methods
- `java.util.Arrays` - Array manipulation utilities
- `java.lang.Math` - Mathematical operations
**Key Implementation Rules**:
1. Make the class `final` (prevents inheritance)
2. Add a private constructor (prevents instantiation)
3. Make all methods `static`
4. Keep methods focused and well-named
In our case, `StockDataLoader` will be a utility class because:
- File loading is a standalone operation
- The methods don't need instance state
- The functionality might be useful in other parts of the application
- It provides a clean separation between data loading and data representation
Let's design our utility class following these principles:
```plantuml
@startuml
skinparam classAttributeIconSize 0
skinparam class {
BackgroundColor White
ArrowColor Black
BorderColor Black
}
class StockDataLoader <<final>> {
+{static} EXPECTED_HEADER: String
+{static} loadFromCsv(filename: String): List<Stock> throws IOException
-{static} validateHeader(header: String): void
}
@enduml
```
📝 **Implementation Steps**
1. **Create the Basic Utility Class Structure**
```java
public final class StockDataLoader {
// Private constructor prevents instantiation
private StockDataLoader() {
throw new AssertionError("Utility class - do not instantiate");
}
}
```
:::info
🧠 **Why Private Constructor?**
- Prevents anyone from creating instances of the class
- The `AssertionError` is a common practice to make the intention clear
- Since all methods are static, we never need an instance
:::
2. **Define the Expected CSV Header**
```java
public static final String EXPECTED_HEADER = "Date,Symbol,Open,High,Low,Close,Volume";
```
:::info
🧠 **Why Define the Header?**
- Acts as a contract for what the CSV file should look like
- Makes it easy to verify file format
- Helps with documentation
- If the format changes, we only need to update one place
:::
3. **Implement Header Validation**
- Create a private static method `validateHeader` that takes a String parameter
- Compare the header with `EXPECTED_HEADER`
- Throw `IllegalArgumentException` with a clear message if validation fails
:::info
🧠 **What is Header Validation?**
- Checks if the first line of the CSV matches our expected format
- Helps catch problems early (like wrong file format)
- Provides clear error messages to users
- Common in file processing to ensure data integrity
:::
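For reference, a minimal sketch of that method might look like this (assuming the exact-match comparison described above; your error message can differ):
```java
private static void validateHeader(String header) {
    if (!EXPECTED_HEADER.equals(header)) {
        throw new IllegalArgumentException(
                "Invalid CSV header. Expected \"" + EXPECTED_HEADER + "\" but found \"" + header + "\"");
    }
}
```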
4. **Create the Main Loading Method**
- Create a public static method `loadFromCsv` that takes a String filename
- First, think about how to read the file safely:
- What Java class is best for reading text files?
- How can we ensure the file is properly closed?
- What exception types might we need to handle?
- Then, consider how to process the data with Streams:
- How to handle the header line?
- How to transform each line into a Stock object?
- How to collect the results?
    - The next section (Additional Implementation Details) provides some hints
**Stream Operations to Consider**
- What operation would help you skip the header line?
- What operation would transform each line into a Stock object?
- What operation would gather all the results into a List?
- What other Stream operations might be useful for validation or error handling?
**File Reading Basics**
- `BufferedReader` is perfect for reading text files line by line
- `try-with-resources` ensures the file is closed properly
- `Files.newBufferedReader()` is the modern way to open files
- Think about what happens if the file doesn't exist or can't be read
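Putting the file-reading and Stream ideas together, the overall pattern tends to look something like this generic sketch (not the finished `loadFromCsv`; `some_data.csv` is just a placeholder file name):
```java
Path filePath = Paths.get("some_data.csv");
try (BufferedReader reader = Files.newBufferedReader(filePath)) {
    List<String> dataLines = reader.lines()  // Stream<String>, one element per line
            .skip(1)                         // skip the header line
            .map(String::trim)               // transform each line
            .toList();                       // gather the results into a List
} catch (IOException e) {
    // Decide whether to recover, wrap, or rethrow (see the I/O handling patterns below)
}
```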
**Key Points**:
- Make all methods `static` since they belong to the class, not instances
- Use clear, descriptive method and variable names
- Add Javadoc comments to document public methods
- Follow the Single Responsibility Principle - each method does one thing well
### Additional Implementation Details
#### Understanding BufferedReader
- A buffered character-input stream that reads text from a character-input stream
- Buffers the input to provide efficient reading of characters, arrays, and lines
- More efficient than reading one character at a time
- Provides methods like `readLine()` for convenient line-by-line reading
- Example usage:
<!-- -->
```java
BufferedReader reader = Files.newBufferedReader(filePath);
String line = reader.readLine(); // Read one line
```
#### Understanding Buffering
- Instead of reading one character at a time from disk (which is slow), BufferedReader reads a larger chunk of data into memory
- This chunk (the buffer) is then used to satisfy subsequent read requests
- When the buffer is empty, it automatically reads the next chunk from disk
- This reduces the number of expensive disk operations, making file reading much faster
#### Understanding try-with-resources
- A try statement that declares one or more resources
- Resources are automatically closed when the try block ends
- Prevents resource leaks even if exceptions occur
- Works with any class that implements `AutoCloseable` or `Closeable`
- Example usage:
<!-- -->
```java
Path filePath = Paths.get(filename);
try (BufferedReader reader = Files.newBufferedReader(filePath)) {
String line;
while ((line = reader.readLine()) != null) {
// Process the line
}
} // reader is automatically closed here
```
<!-- -->
- For a refresher on try-with-resources, see zyBooks Section 8.12
#### I/O Exception Handling Patterns
When working with files and I/O operations, you'll encounter several types of exceptions. Here's how to handle them properly:
1. **Common I/O Exceptions**
- `IOException`: Base class for most I/O errors
- `FileNotFoundException`: When a file doesn't exist
- `SecurityException`: When you don't have permission to access the file
- `IllegalArgumentException`: When file path is invalid
2. **Resource Cleanup Best Practices**
- Always use try-with-resources for file operations
- Never close resources manually in finally blocks (try-with-resources handles this)
- Keep resource declarations as close as possible to their usage
- Example of proper resource handling with a reader and a writer:
<!-- -->
```java
// Good: Resources are properly managed
try (BufferedReader reader = Files.newBufferedReader(filePath);
BufferedWriter writer = Files.newBufferedWriter(outputPath)) {
String line;
while ((line = reader.readLine()) != null) {
writer.write(line);
}
} catch (java.nio.file.NoSuchFileException e) {
// Handle missing file
} catch (IOException e) {
// Handle other I/O errors
}
```
3. **Error Handling Patterns**
- **Specific Before General**: Catch specific exceptions before general ones
- **Meaningful Messages**: Include context in error messages
- **Proper Logging**: Use appropriate logging levels
- **Recovery Options**: Provide ways to recover when possible
- Example of good error handling:
<!-- -->
```java
public void readFile(String filename) {
    try (BufferedReader reader = Files.newBufferedReader(Paths.get(filename))) {
String line;
while ((line = reader.readLine()) != null) {
System.out.println(line);
}
} catch (java.nio.file.NoSuchFileException e) {
throw new IllegalArgumentException("Stock data file not found: " + filename, e);
} catch (IOException e) {
throw new IllegalStateException("Error reading stock data file: " + filename, e);
} catch (IllegalArgumentException e) {
throw new IllegalArgumentException("Invalid stock data format: " + e.getMessage(), e);
}
}
```
4. **Exception Chaining**
- Use exception chaining to preserve the original error context
- Pass the original exception as the cause
- Add meaningful context in the new exception
- Example:
<!-- -->
```java
try {
// Some I/O operation
} catch (IOException e) {
throw new IllegalStateException("Failed to process stock data", e);
}
```
5. **Catch or Continue to Throw**
    - Your file reader throws some exceptions up the call chain.
    - At some point you will need to handle these rather than letting them propagate all the way to the eventual `main`. Soon we will be building GUIs, and letting exceptions escape like that will cause your entire application to crash.
    - For this case, ask yourself:
        - When should the user get the feedback?
        - Which class has the job of notifying the user?
        - Does it make sense for the DataLoader to talk to the user?
        - Does it make sense for the Model to talk to the user?
        - Or does it make sense for our GUI to eventually talk to the user?
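As one hedged sketch of that division of responsibility (the calling code shown here is purely illustrative), the loader and the model let the exceptions travel upward, and the user-facing layer decides how to report them:
```java
// In the (future) user-facing layer -- illustrative only
try {
    model.loadData("stocks.csv");  // StockModel simply rethrows what StockDataLoader reports
} catch (IOException | IllegalArgumentException e) {
    // This layer owns the conversation with the user, e.g. show a dialog or a message
    System.err.println("Could not load stock data: " + e.getMessage());
}
```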
:::warning
🧠 **Common I/O Pitfalls**
- Don't ignore exceptions (empty catch blocks)
- Don't close resources manually in finally blocks
- Don't catch Exception without good reason
- Don't lose the original exception context
- Don't forget to validate file paths before use
:::
## Part 3 - The StockModel Class
Finally, we'll create the model class that will use the Streams API to analyze our stock data.
### Designing the StockModel Class
Here's a design for our model class that will use these Stream operations:
```plantuml
@startuml
skinparam classAttributeIconSize 0
skinparam class {
BackgroundColor White
ArrowColor Black
BorderColor Black
}
class StockModel {
-stocks: List<Stock>
+StockModel()
+loadData(filename: String): void throws IOException
+getStocks(): List<Stock>
+getStocksBySymbol(symbol: String): List<Stock>
+getStocksByDateRange(start: LocalDate, end: LocalDate): List<Stock>
+getUniqueSymbols(): List<String>
+getDailyPriceChanges(): Map<String, List<Map.Entry<LocalDate, BigDecimal>>>
}
StockModel "0..*" --> Stock: contains
StockModel ..> StockDataLoader: uses
@enduml
```
### Implementation Steps
1. **Create the Basic Structure**
- Create a class named `StockModel`
- Add a field to store the list of stocks
- Think about immutability:
- Should the list reference be able to change after initialization?
- What are the pros and cons of making the field final?
- How does this relate to the immutable design of the `Stock` class?
    - Use IntelliJ's code generator (Code → Generate → Constructor) to create a no-arg constructor that initializes the list
    - Use IntelliJ's code generator (Code → Generate → Getter) to create a getter for the list
- Add a `loadData` method that uses `StockDataLoader`
- Think about how to handle loading data with a final list:
- Can you reassign the list reference?
- What operations can you perform on the list itself?
- How can you update the list's contents without changing its reference?
- What should you do if the list already has data in it? What method would help you remove all existing elements before adding new ones?
2. **Implement Data Access Methods** (See UML for more details)
- `getStocks`: A basic getter that returns the stocks list
- `getStocksBySymbol(String symbol)`: Returns stocks for a specific symbol
- This must be solved using streams
- Think about which Stream operations would help filter stocks by symbol
- `getStocksByDateRange(LocalDate start, LocalDate end)`: Returns stocks within a date range
- Use streams to filter stocks by date
- Consider how to handle inclusive date ranges
- `getUniqueSymbols()`: Returns a list of unique stock symbols
- Use streams to get distinct symbols
- Consider how to handle empty lists
3. **Implement Analysis Methods**
- `getDailyPriceChanges()`: Calculates the difference between consecutive days' closing prices
- Group stocks by symbol first
- Sort by date within each group
- Calculate differences between consecutive prices
- Look at the UML for implementation expectations
:::info
🧠 **Stream Operation Hints**
- For filtering stocks by symbol or date range, use `filter()`
- For getting unique symbols, use `distinct()`
- For daily changes, use `groupingBy()` to separate by symbol, then process each group
- Remember that Stream operations don't modify the original data
:::
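To make those hints concrete, here is a hedged sketch of two of the access methods (assuming the list field is named `stocks`; the exact form is up to you):
```java
public List<Stock> getStocksBySymbol(String symbol) {
    return stocks.stream()
            .filter(stock -> stock.getSymbol().equals(symbol))
            .toList();
}

public List<String> getUniqueSymbols() {
    return stocks.stream()
            .map(Stock::getSymbol)
            .distinct()
            .toList();
}
```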
:::warning
🧠 **Common Pitfalls**
- Remember that Stream operations don't modify the original data
- Terminal operations consume the Stream - you can't reuse it
- Some operations might need to handle empty Streams carefully
- Be careful with parallel streams - they might not always be faster
- Watch out for stateful operations in parallel streams
:::
### Test Specifications
The following test class defines the expected behavior of your implementation. You can place this file in your `src/test/java/jam08` package:
```java
/* *****************************************
* CSCI 205 - Software Engineering and Design
* Spring 2025
*
* Name: Lily Romano
*
* Project: csci205_jams
* Package: jam08
* Class: StockModelTest
*
* Description:
*
* ****************************************
*/
package jam08;
import static org.junit.jupiter.api.Assertions.*;
import java.io.FileNotFoundException;
import java.io.IOException;
import java.io.PrintWriter;
import java.math.BigDecimal;
import java.time.LocalDate;
import java.util.List;
import java.util.Map;
import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.Test;
/**
* Tests for the StockModel class that handles stock data management.
*/
public class StockModelTest {
private StockModel model;
private static final String TEST_FILE = "test_stocks.csv";
/**
* Sets up a new StockModel instance and creates test data before each test.
*/
@BeforeEach
void setUp() {
model = new StockModel();
// Create test data file
createTestDataFile();
}
/**
* Tests loading stock data from a CSV file.
*/
@Test
void testLoadData() throws IOException {
model.loadData(TEST_FILE);
assertFalse(model.getStocks().isEmpty());
assertEquals(6, model.getStocks().size());
}
/**
* Tests retrieving stocks by their symbol.
*/
@Test
void testGetStocksBySymbol() throws IOException {
model.loadData(TEST_FILE);
List<Stock> appleStocks = model.getStocksBySymbol("AAPL");
assertFalse(appleStocks.isEmpty());
assertTrue(appleStocks.stream()
.allMatch(s -> s.getSymbol().equals("AAPL")));
}
/**
* Tests retrieving stocks within a specified date range.
*/
@Test
void testGetStocksByDateRange() throws IOException {
model.loadData(TEST_FILE);
LocalDate start = LocalDate.parse("2024-01-01");
LocalDate end = LocalDate.parse("2024-01-02");
List<Stock> stocks = model.getStocksByDateRange(start, end);
assertFalse(stocks.isEmpty());
assertTrue(stocks.stream()
.allMatch(s -> !s.getDate().isBefore(start) && !s.getDate().isAfter(end)));
}
/**
* Tests retrieving unique stock symbols from the loaded data.
*/
@Test
void testGetUniqueSymbols() throws IOException {
model.loadData(TEST_FILE);
List<String> symbols = model.getUniqueSymbols();
assertEquals(3, symbols.size()); // AAPL, GOOGL, MSFT
assertTrue(symbols.contains("AAPL"));
assertTrue(symbols.contains("GOOGL"));
assertTrue(symbols.contains("MSFT"));
}
/**
* Tests calculating daily price changes for stocks.
*/
@Test
void testGetDailyPriceChanges() throws IOException {
model.loadData(TEST_FILE);
Map<String, List<Map.Entry<LocalDate, BigDecimal>>> changes = model.getDailyPriceChanges();
// Check that we have keys for all stocks
assertEquals(3, changes.size());
// Check AAPL changes
List<Map.Entry<LocalDate, BigDecimal>> aaplChanges = changes.get("AAPL");
        assertEquals(3, aaplChanges.size()); // null + 2 changes
assertNull(aaplChanges.get(0)); // First day has no change
Map.Entry<LocalDate, BigDecimal> aaplChange = aaplChanges.get(1);
assertEquals(LocalDate.parse("2024-01-02"), aaplChange.getKey());
assertEquals(new BigDecimal("0.75"), aaplChange.getValue()); // 185.50 - 184.75
// Check GOOGL changes
List<Map.Entry<LocalDate, BigDecimal>> googlChanges = changes.get("GOOGL");
assertEquals(2, googlChanges.size()); // null + 1 change
assertNull(googlChanges.get(0)); // First day has no change
Map.Entry<LocalDate, BigDecimal> googlChange = googlChanges.get(1);
assertEquals(LocalDate.parse("2024-01-02"), googlChange.getKey());
assertEquals(new BigDecimal("1.50"), googlChange.getValue()); // 143.25 - 141.75
// Check MSFT has no changes (only one day of data)
List<Map.Entry<LocalDate, BigDecimal>> msftChanges = changes.get("MSFT");
assertEquals(1, msftChanges.size());
assertNull(msftChanges.getFirst()); // Only one day, so no change
}
/**
* Tests handling of non-existent file when loading data.
*/
@Test
void testInvalidFile() {
assertThrows(FileNotFoundException.class, () ->
model.loadData("nonexistent.csv"));
}
/**
* Tests handling of invalid data format in CSV file.
*/
@Test
void testInvalidDataFormat() throws IOException {
// Create a file with invalid format (missing header)
try (PrintWriter writer = new PrintWriter("invalid_format.csv")) {
writer.println("2024-01-01,AAPL,180.50,185.00,180.00,184.75,1000000");
}
assertThrows(IllegalArgumentException.class, () ->
model.loadData("invalid_format.csv"));
}
/**
* Creates a test CSV file with sample stock data.
*/
private void createTestDataFile() {
// Create a test CSV file with sample data
try (PrintWriter writer = new PrintWriter(TEST_FILE)) {
writer.println("Date,Symbol,Open,High,Low,Close,Volume");
writer.println("2024-01-01,AAPL,180.50,185.00,180.00,184.75,1000000");
writer.println("2024-01-02,AAPL,184.75,186.25,183.50,185.50,1200000");
writer.println("2024-01-03,AAPL,182.75,184.25,181.50,183.50,1100000");
writer.println("2024-01-01,GOOGL,140.25,142.00,140.00,141.75,800000");
writer.println("2024-01-02,GOOGL,141.75,143.50,141.50,143.25,900000");
writer.println("2024-01-02,MSFT,375.00,378.50,374.00,377.75,1500000");
} catch (IOException e) {
fail("Could not create test file");
}
}
}
```
> 📝 **Checkpoints**
>
>After completing each part, verify:
>
>**Part 1 - Stock Class**:
>
>- [ ] Class matches the provided UML
>
>**Part 2 - StockDataLoader**:
>
>- [ ] Class matches the provided UML
>
>**Part 3 - StockModel**:
>
>- [ ] Class matches the provided UML
>- [ ] All tests pass
## Save Your Work - Exercise 1
Verify which files are uncommitted:
```bash
git status
```
Stage your changes:
```bash
git add .idea/ src/main/java/jam08/Stock.java src/main/java/jam08/StockDataLoader.java src/main/java/jam08/StockModel.java src/test/java/jam08/StockModelTest.java src/test/resources/jam08/invalid_format.csv src/test/resources/jam08/test_stocks.csv
```
Commit your work:
```bash
git commit -m "jam08: Implement stock data model and processing"
```