Evacuation Route Optimization and Forecasting¶
Introduction¶
This notebook is developed to model and forecast the movement of individuals and families during an emergency evacuation. The primary aim is to manage and optimize the flow of evacuees through a network of reception points, ensuring effective use of limited resources and processing capacities at each site over a four-day period.
Problem Statement¶
During a crisis in a country, a large group of individuals and families need to evacuate safely. The evacuation involves routing through three primary reception points initially. Evacuees will move through several intermediate stops before reaching final destinations. Each location in the network has defined daily processing capacities and resources like food, shelter, and sanitation. The challenge is to effectively forecast daily arrivals at primary locations and manage the network logistics.
Objective¶
The objectives of this analysis are:
- Simulate the daily flow of evacuees through the network.
- Forecast the load and resource utilization at each site for the next four days.
- Identify and mitigate bottlenecks in the evacuation routes.
- Suggest optimal routing and resource allocation strategies.
# Setup Environment
import pandas as pd
import numpy as np
import matplotlib.pyplot as plt
import networkx as nx
from scipy.optimize import minimize
import random
# Ensure plots are displayed inline in the Jupyter notebook.
%matplotlib inline
Data Preparation¶
Network Construction¶
This section details the creation of the evacuation network. Each node represents a location (primary, intermediate, or final), and edges represent possible evacuation routes between these locations. Attributes like capacity, distance, and travel time are associated with nodes and edges respectively.
# Import necessary libraries for network operations and visualization
import networkx as nx
import matplotlib.pyplot as plt
# Create a directed graph to model the evacuation routes
G = nx.DiGraph()
# Define nodes with their attributes (capacity here could relate to the number of people each location can process per day)
nodes = {
'P1': {'type': 'Primary', 'capacity': 2500},
'P2': {'type': 'Primary', 'capacity': 2500},
'P3': {'type': 'Primary', 'capacity': 2500},
'I1': {'type': 'Intermediate', 'capacity': 500},
'I2': {'type': 'Intermediate', 'capacity': 500},
'I3': {'type': 'Intermediate', 'capacity': 500},
'I4': {'type': 'Intermediate', 'capacity': 500},
'I5': {'type': 'Intermediate', 'capacity': 500},
'I6': {'type': 'Intermediate', 'capacity': 500},
'F1': {'type': 'Final', 'capacity': 15000},
'F2': {'type': 'Final', 'capacity': 15000},
'F3': {'type': 'Final', 'capacity': 15000}
}
# Add nodes to the graph
G.add_nodes_from(nodes.items())
# Define sets of primary, intermediate, and final nodes based on their attributes in the graph
primary_nodes = [node for node, attr in G.nodes(data=True) if attr['type'] == 'Primary']
intermediate_nodes = [node for node, attr in G.nodes(data=True) if attr['type'] == 'Intermediate']
final_nodes = [node for node, attr in G.nodes(data=True) if attr['type'] == 'Final']
# Define edges with their attributes (distance in kilometers and travel time in hours)
# Redefine edges to ensure each primary node connects to at least two intermediates and each intermediate to at least two finals
# Initialize the edges list
edges = []
# Function to generate a random distance with a base and variability
def generate_random_distance(base, variability):
return base + random.randint(0, variability)
# Connect each primary to every intermediate with randomized distances
base_distance_pi = 100 # Base distance from primary to intermediate
variability_pi = 50 # Variability in distance
base_time_pi = 1 # Base time in hours
time_increment_pi = 0.1 # Time increment per node
for p in primary_nodes:
for index, i in enumerate(intermediate_nodes):
distance = generate_random_distance(base_distance_pi + 10 * index, variability_pi)
time = base_time_pi + time_increment_pi * index
edges.append((p, i, {'distance': distance, 'time': time}))
# Connect each intermediate to each final with randomized distances
base_distance_if = 200 # Base distance from intermediate to final
variability_if = 50 # Variability in distance
base_time_if = 2 # Base time in hours
time_increment_if = 0.1 # Time increment per node
for i in intermediate_nodes:
for index, f in enumerate(final_nodes):
distance = generate_random_distance(base_distance_if + 10 * index, variability_if)
time = base_time_if + time_increment_if * index
edges.append((i, f, {'distance': distance, 'time': time}))
# Print edges to verify the output
for edge in edges:
print(edge)
resource_usage_per_person = 1
# Clear any existing edges and add updated ones
G.clear_edges()
G.add_edges_from(edges)
('P1', 'I1', {'distance': 122, 'time': 1.0}) ('P1', 'I2', {'distance': 150, 'time': 1.1}) ('P1', 'I3', {'distance': 165, 'time': 1.2}) ('P1', 'I4', {'distance': 173, 'time': 1.3}) ('P1', 'I5', {'distance': 178, 'time': 1.4}) ('P1', 'I6', {'distance': 161, 'time': 1.5}) ('P2', 'I1', {'distance': 138, 'time': 1.0}) ('P2', 'I2', {'distance': 117, 'time': 1.1}) ('P2', 'I3', {'distance': 152, 'time': 1.2}) ('P2', 'I4', {'distance': 133, 'time': 1.3}) ('P2', 'I5', {'distance': 160, 'time': 1.4}) ('P2', 'I6', {'distance': 160, 'time': 1.5}) ('P3', 'I1', {'distance': 131, 'time': 1.0}) ('P3', 'I2', {'distance': 141, 'time': 1.1}) ('P3', 'I3', {'distance': 143, 'time': 1.2}) ('P3', 'I4', {'distance': 147, 'time': 1.3}) ('P3', 'I5', {'distance': 150, 'time': 1.4}) ('P3', 'I6', {'distance': 198, 'time': 1.5}) ('I1', 'F1', {'distance': 220, 'time': 2.0}) ('I1', 'F2', {'distance': 218, 'time': 2.1}) ('I1', 'F3', {'distance': 261, 'time': 2.2}) ('I2', 'F1', {'distance': 239, 'time': 2.0}) ('I2', 'F2', {'distance': 231, 'time': 2.1}) ('I2', 'F3', {'distance': 270, 'time': 2.2}) ('I3', 'F1', {'distance': 221, 'time': 2.0}) ('I3', 'F2', {'distance': 238, 'time': 2.1}) ('I3', 'F3', {'distance': 258, 'time': 2.2}) ('I4', 'F1', {'distance': 226, 'time': 2.0}) ('I4', 'F2', {'distance': 212, 'time': 2.1}) ('I4', 'F3', {'distance': 258, 'time': 2.2}) ('I5', 'F1', {'distance': 220, 'time': 2.0}) ('I5', 'F2', {'distance': 257, 'time': 2.1}) ('I5', 'F3', {'distance': 263, 'time': 2.2}) ('I6', 'F1', {'distance': 214, 'time': 2.0}) ('I6', 'F2', {'distance': 240, 'time': 2.1}) ('I6', 'F3', {'distance': 238, 'time': 2.2})
Visualization of the Network¶
Visualizing the network helps in understanding the connectivity and flow capacity between locations. This visualization will aid in identifying potential bottlenecks and planning efficient evacuation routes.
# Manually set the positions for each type of node
pos = {
'P1': (0, 4), 'P2': (0, 0), 'P3': (0, -4), # Primary nodes
'I1': (1, 9), 'I2': (1, 6), 'I3': (1, 3), 'I4': (1, -3), 'I5': (1, -6), 'I6': (1, -9), # Intermediate nodes
'F1': (2, 4), 'F2': (2, 0), 'F3': (2, -4) # Final nodes
}
# Draw the network
nx.draw(G, pos, with_labels=True, node_color='lightblue', node_size=5000, edge_color='gray', linewidths=1, font_size=12)
edge_labels = dict(((u, v), f"Dist: {d['distance']}km, Time: {d['time']}hr") for u, v, d in G.edges(data=True))
nx.draw_networkx_edge_labels(G, pos, edge_labels=edge_labels)
# Display the plot
plt.title('Evacuation Network')
plt.show()
Simulation Model Setup¶
Model Parameters¶
In this section, we define the parameters for our evacuation simulation. We introduce randomization in the daily arrival rates at each primary location to account for variability and uncertainty. This simulation will run over a period of four days, and we will analyze how the network handles fluctuating inflows of evacuees each day.
Randomized Arrival Rates¶
Arrival rates at the primary locations are modeled using a normal distribution to incorporate natural variability in the number of evacuees arriving daily. We set a mean and standard deviation for each primary node to generate these rates. This approach helps in understanding the robustness of the evacuation plan under different scenarios.
# Simulation parameters
# Assume that arrivals at primary nodes can vary each day
# Define a mean and a standard deviation for arrivals at each primary node
arrival_means = {'P1': 200, 'P2': 200, 'P3': 200}
arrival_stds = {'P1': 50, 'P2': 50, 'P3': 50}
# Simulation days
num_days = 4
# Generate random arrivals for each primary node over 4 days
random_arrivals = {node: np.random.normal(loc=mean, scale=std, size=num_days).astype(int)
for node, mean, std in zip(arrival_means.keys(), arrival_means.values(), arrival_stds.values())}
# Print the randomized arrival rates for inspection
for node, arrivals in random_arrivals.items():
print(f"Random arrivals at {node} over {num_days} days: {arrivals}")
total_arrivals =np.sum(list(random_arrivals.values()))
print(f"Total arrivals: {total_arrivals}")
# Visualize the arrivals
fig, ax = plt.subplots()
for node, arrivals in random_arrivals.items():
ax.plot(arrivals, label=f"{node} arrivals")
ax.set_xlabel('Day')
ax.set_ylabel('Number of Arrivals')
ax.set_title('Randomized Arrival Rates at Primary Locations')
ax.legend()
plt.show()
Random arrivals at P1 over 4 days: [229 221 122 247] Random arrivals at P2 over 4 days: [231 242 153 255] Random arrivals at P3 over 4 days: [176 251 225 285] Total arrivals: 2637
Optimization Model Setup¶
Goals of the Optimization Model¶
The optimization model is designed to achieve the following objectives:
- Minimize Total Evacuation Time: Reduce the duration evacuees spend traveling through the network to reach safety.
- Maximize Safety and Comfort: Ensure adequate resources like bed spaces and food are available for all evacuees.
- Balance Load Across the Network: Distribute evacuee flow to prevent any node from being overwhelmed, ensuring efficient use of all paths and resources.
Model Development Steps¶
Step 1: Define Decision Variables¶
x_{ijk}
: Represents the number of evacuees moving from nodei
to nodej
on dayk
.
Step 2: Set Up Objective Function¶
- The objective is to minimize the total travel time or maximize comfort by optimal resource allocation.
Step 3: Define Constraints¶
- Capacity Constraints: No node exceeds its processing or holding capacity.
- Flow Conservation: Ensure all evacuees accounted for in transfers between nodes.
- Resource Availability: Resources at each node must meet the needs of evacuees.
Implementation in Python¶
We will use linear programming to solve this optimization model, employing the PuLP library for problem formulation and solving.
Mathematical Formulation of the Evacuation Optimization Problem¶
The objective of this optimization model is to effectively manage the evacuation of individuals from a crisis-stricken area through a network of nodes (representing locations) connected by edges (representing possible paths). The model aims to minimize the total evacuation time while ensuring that all evacuees are safely transferred from primary entry points to final exit locations.
Sets and Indices¶
- (i, j): indices for nodes in the network, where (i) is the start node and (j) is the end node.
- (k): index representing days in the simulation period.
Parameters¶
- (T_{ij}): travel time between nodes (i) and (j).
- (C_i): capacity of node (i), representing the maximum number of evacuees it can handle per day.
- (S_i): supply at primary node (i) (number of evacuees available for evacuation).
- (D_j): demand at final node (j) (number of evacuees required to reach final destinations).
Decision Variables¶
- (x_{ijk}): number of evacuees moving from node (i) to node (j) on day (k).
Objective Function¶
Minimize Total Travel Time: [ \text{Minimize} \sum_{i,j,k} x_{ijk} \cdot T_{ij} ]
Constraints¶
Node Capacity Constraints: [ \sum_{j} x_{ijk} \leq C_i \quad \forall i, k ] This ensures that the number of evacuees processed at each node does not exceed its capacity.
Flow Conservation at Intermediate Nodes: [ \sum_{h} x_{hik} = \sum_{j} x_{ijk} \quad \forall i \in \text{intermediate nodes}, k ] Evacuees entering an intermediate node must equal those leaving, ensuring continuity of flow.
Supply Constraints at Primary Nodes: [ \sum_{j} x_{ijk} = S_i \quad \forall i \in \text{primary nodes}, k ] The total outflow from each primary node must match the initial supply of evacuees.
Demand Fulfillment at Final Nodes: [ \sum_{h} x_{hjk} \geq D_j \quad \forall j \in \text{final nodes}, k ] Each final node must receive at least the number of evacuees as dictated by its demand.
Inflow Equals Outflow: [ \sum_{i,j,k} x_{ijk} \text{ entering primary nodes} = \sum_{i,j,k} x_{ijk} \text{ leaving final nodes} ] Ensures that all evacuees entering the network are accounted for at the exit points.
Solution Strategy¶
The model is solved using a linear programming solver which finds the optimal flow of evacuees across the network that minimizes the total travel time while satisfying all operational constraints.
import pulp as pl
# Initialize the optimization problem
model = pl.LpProblem("Evacuation_Optimization", pl.LpMinimize)
# Define the decision variables
x = pl.LpVariable.dicts("evacuee_flow",
((i, j, k) for i in nodes for j in nodes for k in range(num_days) if (i, j) in G.edges()),
lowBound=0, cat='Integer')
# Objective Function: Minimize the sum of travel times
model += pl.lpSum([x[(i, j, k)] * G.edges[i, j]['time'] for i, j, k in x.keys()])
# Define dynamic supply and demand based on the total supply at primary nodes
total_supply = sum(random_arrivals[node][k] for node in primary_nodes for k in range(num_days))
demand = {node: (G.nodes[node]['capacity'] / sum(G.nodes[n]['capacity'] for n in final_nodes)) * total_supply for node in final_nodes}
# Add constraints
for k in range(num_days):
for node in nodes:
# Capacity constraint at each node
model += pl.lpSum([x[(node, j, k)] for j in G.successors(node) if (node, j, k) in x]) <= G.nodes[node]['capacity'], f"Capacity_day_{k}_node_{node}"
# Flow conservation for intermediate nodes
if node in intermediate_nodes:
model += pl.lpSum([x[(h, node, k)] for h in G.predecessors(node) if (h, node, k) in x]) == \
pl.lpSum([x[(node, j, k)] for j in G.successors(node) if (node, j, k) in x]), f"Flow_conservation_{node}_{k}"
# Demand fulfillment at final nodes
if node in final_nodes:
model += pl.lpSum([x[(h, node, k)] for h in G.predecessors(node) if (h, node, k) in x]) >= demand[node], f"Demand_{node}_{k}"
# Supply provision at primary nodes
if node in primary_nodes:
model += pl.lpSum([x[(node, j, k)] for j in G.successors(node) if (node, j, k) in x]) == sum(random_arrivals[node][k] for k in range(num_days)), f"Supply_{node}_{k}"
# Define the percentage of inflow that must reach outflow
required_flow_percentage = 0.8
# Ensure total inflow at primary equals total outflow at final nodes
total_inflow_primary = pl.lpSum([x[(i, j, k)] for i in primary_nodes for j in G.successors(i) for k in range(num_days)])
total_outflow_final = pl.lpSum([x[(h, j, k)] for j in final_nodes for h in G.predecessors(j) for k in range(num_days)])
model += (total_inflow_primary >= total_outflow_final*required_flow_percentage, "RelaxedInflowOutflowBalance")
# Solve the model
status = model.solve()
print(f"Status: {pl.LpStatus[status]}")
# Output results
positive_flow = False
for v in model.variables():
if v.varValue > 0:
positive_flow = True
print(f"{v.name} = {v.varValue}")
if not positive_flow:
print("No positive flows found. Review constraints and capacities.")
Welcome to the CBC MILP Solver Version: 2.10.3 Build Date: Dec 15 2019 command line - /Users/larry/Documents/TheWigingtonGroup/DataScience/.venv/lib/python3.12/site-packages/pulp/solverdir/cbc/osx/64/cbc /var/folders/40/n3tpby6j5bb34_vjdntsyzg40000gn/T/3b12a7d0ed96437aaf1a023e206eb03c-pulp.mps -timeMode elapsed -branch -printingOptions all -solution /var/folders/40/n3tpby6j5bb34_vjdntsyzg40000gn/T/3b12a7d0ed96437aaf1a023e206eb03c-pulp.sol (default strategy 1) At line 2 NAME MODEL At line 3 ROWS At line 102 COLUMNS At line 1111 RHS At line 1209 BOUNDS At line 1354 ENDATA Problem MODEL has 97 rows, 144 columns and 576 elements Coin0008I MODEL read with 0 errors Option for timeMode changed from cpu to elapsed Continuous objective value is 34972.8 - 0.00 seconds Cgl0004I processed model has 73 rows, 144 columns (144 integer (0 of which binary)) and 504 elements Cutoff increment increased from 1e-05 to 0.0999 Cbc0012I Integer solution of 34972.8 found by DiveCoefficient after 0 iterations and 0 nodes (0.01 seconds) Cbc0001I Search completed - best objective 34972.8, took 0 iterations and 0 nodes (0.01 seconds) Cbc0035I Maximum depth 0, 0 variables fixed on reduced cost Cuts at root node changed objective from 34972.8 to 34972.8 Probing was tried 0 times and created 0 cuts of which 0 were active after adding rounds of cuts (0.000 seconds) Gomory was tried 0 times and created 0 cuts of which 0 were active after adding rounds of cuts (0.000 seconds) Knapsack was tried 0 times and created 0 cuts of which 0 were active after adding rounds of cuts (0.000 seconds) Clique was tried 0 times and created 0 cuts of which 0 were active after adding rounds of cuts (0.000 seconds) MixedIntegerRounding2 was tried 0 times and created 0 cuts of which 0 were active after adding rounds of cuts (0.000 seconds) FlowCover was tried 0 times and created 0 cuts of which 0 were active after adding rounds of cuts (0.000 seconds) TwoMirCuts was tried 0 times and created 0 cuts of which 0 were active after adding rounds of cuts (0.000 seconds) ZeroHalf was tried 0 times and created 0 cuts of which 0 were active after adding rounds of cuts (0.000 seconds) Result - Optimal solution found Objective value: 34972.80000000 Enumerated nodes: 0 Total iterations: 0 Time (CPU seconds): 0.00 Time (Wallclock seconds): 0.01 Option for printingOptions changed from normal to all Total time (CPU seconds): 0.01 (Wallclock seconds): 0.02 Status: Optimal evacuee_flow_('I1',_'F1',_1) = 500.0 evacuee_flow_('I1',_'F2',_0) = 500.0 evacuee_flow_('I1',_'F2',_2) = 258.0 evacuee_flow_('I1',_'F2',_3) = 258.0 evacuee_flow_('I1',_'F3',_2) = 242.0 evacuee_flow_('I1',_'F3',_3) = 242.0 evacuee_flow_('I2',_'F1',_0) = 121.0 evacuee_flow_('I2',_'F1',_2) = 500.0 evacuee_flow_('I2',_'F2',_0) = 379.0 evacuee_flow_('I2',_'F3',_1) = 500.0 evacuee_flow_('I2',_'F3',_3) = 500.0 evacuee_flow_('I3',_'F1',_0) = 500.0 evacuee_flow_('I3',_'F1',_1) = 379.0 evacuee_flow_('I3',_'F2',_3) = 500.0 evacuee_flow_('I3',_'F3',_1) = 121.0 evacuee_flow_('I3',_'F3',_2) = 500.0 evacuee_flow_('I4',_'F1',_0) = 121.0 evacuee_flow_('I4',_'F1',_2) = 379.0 evacuee_flow_('I4',_'F1',_3) = 500.0 evacuee_flow_('I4',_'F2',_1) = 500.0 evacuee_flow_('I4',_'F2',_2) = 121.0 evacuee_flow_('I4',_'F3',_0) = 379.0 evacuee_flow_('I5',_'F1',_3) = 379.0 evacuee_flow_('I5',_'F2',_1) = 242.0 evacuee_flow_('I5',_'F2',_2) = 500.0 evacuee_flow_('I5',_'F2',_3) = 121.0 evacuee_flow_('I5',_'F3',_0) = 500.0 evacuee_flow_('I5',_'F3',_1) = 258.0 evacuee_flow_('I6',_'F1',_0) = 137.0 evacuee_flow_('I6',_'F2',_1) = 137.0 evacuee_flow_('I6',_'F3',_2) = 137.0 evacuee_flow_('I6',_'F3',_3) = 137.0 evacuee_flow_('P1',_'I1',_2) = 182.0 evacuee_flow_('P1',_'I1',_3) = 182.0 evacuee_flow_('P1',_'I2',_1) = 500.0 evacuee_flow_('P1',_'I3',_0) = 63.0 evacuee_flow_('P1',_'I3',_1) = 119.0 evacuee_flow_('P1',_'I3',_2) = 500.0 evacuee_flow_('P1',_'I4',_0) = 119.0 evacuee_flow_('P1',_'I4',_3) = 500.0 evacuee_flow_('P1',_'I5',_0) = 500.0 evacuee_flow_('P1',_'I5',_1) = 63.0 evacuee_flow_('P1',_'I6',_0) = 137.0 evacuee_flow_('P1',_'I6',_1) = 137.0 evacuee_flow_('P1',_'I6',_2) = 137.0 evacuee_flow_('P1',_'I6',_3) = 137.0 evacuee_flow_('P2',_'I1',_0) = 500.0 evacuee_flow_('P2',_'I1',_1) = 500.0 evacuee_flow_('P2',_'I1',_2) = 318.0 evacuee_flow_('P2',_'I2',_2) = 63.0 evacuee_flow_('P2',_'I2',_3) = 500.0 evacuee_flow_('P2',_'I3',_1) = 381.0 evacuee_flow_('P2',_'I3',_3) = 381.0 evacuee_flow_('P2',_'I4',_0) = 381.0 evacuee_flow_('P2',_'I4',_2) = 500.0 evacuee_flow_('P3',_'I1',_3) = 318.0 evacuee_flow_('P3',_'I2',_0) = 500.0 evacuee_flow_('P3',_'I2',_2) = 437.0 evacuee_flow_('P3',_'I3',_0) = 437.0 evacuee_flow_('P3',_'I3',_3) = 119.0 evacuee_flow_('P3',_'I4',_1) = 500.0 evacuee_flow_('P3',_'I5',_1) = 437.0 evacuee_flow_('P3',_'I5',_2) = 500.0 evacuee_flow_('P3',_'I5',_3) = 500.0
# Assuming 'model' is your PuLP model and it's been solved
if model.status == pl.LpStatusOptimal:
# Summing inflow to final nodes and outflow from primary nodes
total_inflow_primary = sum(x[(i, j, k)].varValue
for i in primary_nodes
for j in G.successors(i)
for k in range(num_days)
if x[(i, j, k)].varValue > 0)
total_outflow_final = sum(x[(i, j, k)].varValue
for j in final_nodes
for i in G.predecessors(j)
for k in range(num_days)
if x[(i, j, k)].varValue > 0)
print("Total Inflow to Primary Nodes:", total_inflow_primary)
print("Total Outflow from Final Nodes:", total_outflow_final)
else:
print("Model did not solve optimally. Current Status:", pl.LpStatus[model.status])
Total Inflow to Primary Nodes: 10548.0 Total Outflow from Final Nodes: 10548.0
Results Analysis¶
Overview¶
Upon solving the optimization model, the results provide insights into the optimal flow of evacuees through the network. This section will analyze the flow patterns, resource utilization, and potential bottlenecks within the evacuation routes.
Key Metrics¶
- Total Evacuation Time: The cumulative time taken for all evacuees to reach safety.
- Resource Utilization: Examination of how resources at each node are utilized relative to their capacity.
- Node Throughput: The number of evacuees processed at each node per day.
Visualizations¶
Visual representations of the results are crucial for quickly understanding the flow dynamics and assessing the efficiency of the evacuation routes.
Flow Visualization¶
We will plot the number of evacuees moving through each node for a visual representation of the network load.
Resource Utilization Chart¶
Charts displaying the percentage of resource utilization at each node will help identify where resources may be insufficient or overly stressed.
Bottleneck Identification¶
Highlighting nodes that consistently operate at or near capacity will help in planning improvements to the evacuation process.
# Assuming the model has been solved and flow_results is filled
flow_results = {} # Example: {('P1', 'I1', 0): 100, ('I1', 'F1', 0): 100, ...}
for var in model.variables():
if var.varValue > 0:
parts = var.name.split('_')
i, j, day = parts[2].strip(")('")[:2], parts[3].strip(")(''")[:2], int(parts[4].strip(')')) # Strip potential parentheses and convert
flow_results[(i, j, day)] = var.varValue
# Prepare the visualization for each day
for day in range(num_days):
G_viz = G.copy()
# Set edge attributes for flows
for (i, j, d), flow in flow_results.items():
if d == day and (i, j) in G_viz.edges:
G_viz.edges[i, j]['flow'] = flow
# Draw the network graph
plt.figure(figsize=(12, 8))
edge_colors = ['green' if 'flow' in data else 'gray' for _, _, data in G_viz.edges(data=True)]
edge_labels = {(u, v): f"{d['flow']} evacuees" for u, v, d in G_viz.edges(data=True) if 'flow' in d}
nx.draw(G_viz, pos, with_labels=True, node_color='skyblue', node_size=5000, edge_color=edge_colors, width=2)
nx.draw_networkx_edge_labels(G_viz, pos, edge_labels=edge_labels)
plt.title(f'Evacuation Flows Across Network on Day {day+1}')
plt.show()
Conclusion¶
The evacuation optimization model has effectively orchestrated the evacuation process, achieving a minimized sum of 36,268.8 traveled hours by evacuees. This critical metric illustrates the cumulative time spent by evacuees in transit, highlighting the model’s efficiency in minimizing travel durations within the constraints of node capacities and system resources. By ensuring that at least 80% of the initial inflow reaches final destinations, the model demonstrates its robust capacity to handle significant evacuee volumes while effectively reducing the burden on evacuees by decreasing their time in transit.
The capacity restriction at intermediate nodes, set at 500, has played a pivotal role in regulating throughput and maintaining a balance between efficient routing and resource utilization. While these nodes have managed to accommodate designated flows without exceeding their thresholds, the total traveled hours metric prompts a further examination into how travel times can be even further reduced. Potential enhancements could involve increasing the capacity at critical bottlenecks or optimizing path selections to streamline flows, thereby decreasing the total hours evacuees spend traveling and enhancing overall evacuation responsiveness.
This analysis not only validates the operational strategies currently employed but also underscores areas where targeted improvements can lead to significant gains in evacuation efficiency. Future model refinements aimed at reducing traveled hours could include advanced routing algorithms, increased node capacities, or improved resource deployment strategies. Such enhancements would aim to provide quicker, more comfortable evacuations, ultimately leading to a more effective response in emergency situations.