# Nena's project

###### tags: `bachelors projects` `tba`

---

### Owners (the only ones with permission to edit the main text; others can comment)
Nena, Laura, Marjolein

---

# Monte Carlo simulations of hard anisotropic (triangular prism and square pyramid) particles

## Project description

Over the last few years it has become possible to synthesize colloids in nearly any shape and with nearly any interactions imaginable. An intriguing feature of this impressive development is that particles can exhibit strong anisotropy, which affects their self-assembly. In particular, in addition to translational degrees of freedom, this gives rise to rotational degrees of freedom playing a key role in the self-assembly process. In this project, we will explore the role of rotational and translational anisotropy in glasses and nucleation, explore how they couple or decouple, and investigate what role this plays in the self-assembly. To this end, we will develop and use Monte Carlo computer simulations of hard, anisotropic particles with square pyramid and triangular prism shapes.

## Hard spheres

I started out writing a simulation for hard spheres: first in NVT, then NVT with a cell list, then NPT, and finally NPT with a cell list. For NVT, only a qualitative comparison to the theory (the spheres may not overlap) was possible, but for the NPT ensemble we could compare the obtained data to the Carnahan-Starling equation of state. The following plots were obtained (50.000 MC steps, $\Delta = 0.07$, $\Delta_V = 0.22$, 108 particles, FCC starting configuration, packing fraction from the average volume of the last 2000 steps).

![](https://i.imgur.com/ZputcJA.png) ![](https://i.imgur.com/YOH9eJP.png)

We see that our data points agree reasonably well with the Carnahan-Starling equation of state, especially when we take the system size into account. We also see that in the last MC steps the volume has equilibrated reasonably well: there are no large volume changes anymore.

## Hard cubes

We now tried to do the same thing, but for hard cubes. This brings the challenge of more complicated overlap checks, since overlap can no longer be determined simply from the distance between two particles. To deal with this, I have implemented the Separating Axis Theorem. It states that if there is an axis onto which the projections of two convex particles do not overlap, the particles do not overlap at all. Moreover, the theorem provides a finite set of candidate axes, so that not infinitely many axes need to be checked. The axes that need to be checked in 3D are:

* the normals of the faces of particle 1,
* the normals of the faces of particle 2,
* the cross products between all edges of particle 1 and particle 2.

Also, since spheres are fully rotationally symmetric, we did not need to take rotations into account for them. This is not the case for cubes, so besides translational and volume moves, we need to implement rotational moves. We do this using the Rodrigues rotation matrix. First, we choose a random vector around which we are going to rotate our particle, and after that, we choose a random rotation angle. The rotation matrix is then given by $R = I + \sin(\theta)K + (1-\cos(\theta))K^2$, with $K=\begin{pmatrix}0&-u_z&u_y\\u_z&0&-u_x\\-u_y&u_x&0 \end{pmatrix}$ and $u_x, u_y, u_z$ the x-, y- and z-components of the randomly chosen unit vector. Then, in every Monte Carlo step we do, on average, $n_{part}$ translational moves, $n_{part}$ rotational moves and one volume change.
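As a reference for the rotation moves just described, a minimal sketch in C (hypothetical names; it assumes orientations are stored as 3x3 rotation matrices, and `rand()` stands in for whatever RNG the simulation actually uses):

```c
#include <math.h>
#include <stdlib.h>

/* uniform random double in [0,1) */
static double rnd(void) { return (double)rand() / ((double)RAND_MAX + 1.0); }

/* Rodrigues rotation matrix R = I + sin(theta) K + (1 - cos(theta)) K^2
 * for a rotation by theta around the unit axis u. */
void rodrigues(const double u[3], double theta, double R[3][3])
{
    double K[3][3] = {
        {   0.0, -u[2],  u[1] },
        {  u[2],   0.0, -u[0] },
        { -u[1],  u[0],   0.0 }
    };
    double s = sin(theta), c = 1.0 - cos(theta);
    for (int i = 0; i < 3; i++)
        for (int j = 0; j < 3; j++) {
            double K2ij = 0.0;                        /* (K^2)_{ij} */
            for (int k = 0; k < 3; k++) K2ij += K[i][k] * K[k][j];
            R[i][j] = (i == j ? 1.0 : 0.0) + s * K[i][j] + c * K2ij;
        }
}

/* Trial rotation: a uniformly random unit axis (rejection sampling in the
 * unit ball) and an angle uniform in [-dtheta, dtheta]. */
void random_rotation(double dtheta, double R[3][3])
{
    double u[3], n2;
    do {
        u[0] = 2.0 * rnd() - 1.0;
        u[1] = 2.0 * rnd() - 1.0;
        u[2] = 2.0 * rnd() - 1.0;
        n2 = u[0] * u[0] + u[1] * u[1] + u[2] * u[2];
    } while (n2 < 1e-12 || n2 > 1.0);
    double n = sqrt(n2);
    u[0] /= n; u[1] /= n; u[2] /= n;
    rodrigues(u, dtheta * (2.0 * rnd() - 1.0), R);
}
```

The trial orientation is then `R` times the particle's current rotation matrix, accepted if the SAT overlap check passes.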
![](https://i.imgur.com/0bLaCKJ.png) ![](https://i.imgur.com/0IDNfr8.png)

The plots above were obtained with 10.000 MC steps, $\Delta=0.2$, $\Delta_V= 0.22$, $\Delta_{\theta}=1/2\pi$ and 512 particles. Again, the packing fraction was calculated using the average volume over the last 2000 steps. We see that the data from this simulation matches the reference data quite well.

**Warning!** Later on, some errors were found when I tried to simulate triangular prisms with this code: in some situations, moves that lead to overlap were accepted. Checking the snapshots of the cubes, this does not seem to occur here (see the snapshot below), which can be explained by noting that the cubes have more symmetry; the absence of this symmetry for the triangular prisms is what caused the problems. Still, I need to run this simulation again to make sure all data is correct.

![](https://i.imgur.com/D3RK5j5.png)

## Triangular prisms

Now, the same is done, but for triangular prisms. The expansion branch of the equation of state is obtained by running the simulation for 500.000 MC steps for pressures in the range 5-100, with steps of 0.5. The value of $\Delta$ is determined in 10 trial MC moves, in which it is adjusted such that the translation acceptance rate lies between 0.3 and 0.4. **Beware:** 10 MC trials is not a lot; maybe I should run the simulations again with more MC trial moves. If we fit the obtained data to the equation of state, the following plot is obtained:

![](https://i.imgur.com/H4J1PRW.png)

For this plot, the last 1000 data points are averaged to obtain the packing fraction. **Beware:** it is better to check the volume development over the number of steps, in order to estimate the number of steps over which the packing fraction should be averaged. Taking a look at this volume development, we obtain the following plots:

![](https://i.imgur.com/s9tisjR.png) ![](https://i.imgur.com/fiEDYbZ.png)

If we average the packing fraction over the last 100.000 MC steps, we obtain the following fit to the equation of state:

![](https://i.imgur.com/3CjFxo6.png)

It looks very similar to the plot obtained before. However, in the plots of the volume development, especially the one with $\beta_P = 23$, we see that the system is not yet well equilibrated (there are no repeated 'waves' visible in the volume yet; over the last steps it is, on average, still changing considerably). This is why we decided to carry out the same simulation, but now with 700.000 Monte Carlo steps. This gives the following equation of state:

With volume development plots that look like this:

In this figure, the so-called expansion branch is plotted. This means that we started out with a crystal phase at about the equilibrium packing fraction; depending on the pressure, this crystal then either melts or remains a crystal. However, we would also like to obtain some insight into the compression branch. For this branch, one starts out with a fluid configuration (the last snapshot of the highest melting pressure in the fluid part of the expansion branch), and the system is then simulated at various pressures. For this system to crystallize, however, a nucleation barrier needs to be overcome. This causes the system to form a crystal at higher pressures than in the expansion branch, so we expect a nice hysteresis loop to form. Because of the difficulty of forming a crystal phase, we also expect the simulation to need more time to equilibrate.
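As an aside on the $\Delta$ adjustment described above, a minimal sketch (hypothetical names; only the 0.3-0.4 window and the batch-wise update are taken from these notes, the 0.9/1.1 factors are a guess):

```c
/* After a batch of trial translations, nudge the maximum displacement
 * until the acceptance rate lands in the 0.3-0.4 window. */
void tune_delta(double *delta, int accepted, int tried, double delta_max)
{
    double rate = (double)accepted / (double)tried;
    if (rate < 0.3)
        *delta *= 0.9;          /* too many rejections: take smaller steps */
    else if (rate > 0.4)
        *delta *= 1.1;          /* too many acceptances: take larger steps */
    if (*delta > delta_max)
        *delta = delta_max;     /* keep the step bounded, e.g. below half a box length */
}
```

Any gentle update rule of this kind converges to the same window; running it for only 10 batches is what the warning above is about.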
This is why we ran the data points of the compression branch for 100.000 MC steps. This resulted in the following equation of state:

![](https://i.imgur.com/POroajS.png)

Here, we observe a kink in the green data points, which are the data points of the compression branch. This kink can indicate either that there is some intermediate state present, in between the fluid and the crystal phase, or that the system has not fully equilibrated yet. Some volume development plots of the compression branch looked like this:

![](https://i.imgur.com/KzVs8aY.png)

Here, we see that for $\beta_P = 42$ the volume is still slowly decreasing, which suggests that the system has not yet fully equilibrated. To learn more about what is going on, we will run the simulations longer for the data points with pressures larger than or equal to $\beta_P = 38$. We chose $2*10^6$ MC steps. This resulted in the following equation of state:

![](https://i.imgur.com/zS9Mz4M.png)

Here we see more clearly that indeed, at some pressure, the system jumps from fluid-like to more crystal-like. Later on, we will use order parameters to quantify the degree of fluidity/crystallinity of the system. The reason why the points in the crystal part of the compression branch don't fully reach the crystal branch of the expansion branch is probably that it is not easy for the disordered system to reach a state with perfectly aligned particles throughout the whole system; more likely, it consists of several crystalline domains, which slightly decreases the packing fraction. Observe that there are some points missing at pressures

The volume development plots now look like this:

INSERT VOLDEV PLOTS

And taking a look at the configurations themselves:

INSERT OPENGL SNAPSHOTS + EXPLANATION

## On order parameters

We need ways to quantify the degree of 'fluidity'/'crystallinity' of the system. For this, we can make use of order parameters. We have tested 4 different bond order parameters, for symmetries up to and including 12-fold. We have also tested 2 different orientational order parameters. For all of these, we calculated the order parameter locally (per particle) for 2 snapshots: one from the fluid regime ($\beta_P = 16.5$) and one from the crystal regime ($\beta_P = 61.5$). Then, we plotted the (non-normalized) probability distributions of all order parameters. The results are shown below (from the 'muchdataThor' folder):

![](https://i.imgur.com/tI5E36Q.png) ![](https://i.imgur.com/mCLYfy8.png) ![](https://i.imgur.com/aEzhOlM.png) ![](https://i.imgur.com/9Nxy9e0.png) ![](https://i.imgur.com/RwNtUbw.png)

The various bond order parameters are defined as follows:

* bops: We have chosen to use:

# Bond order parameters of initial configuration (used: Quench/0.45/1)

By quenching a fluid of packing fraction 0.45, we obtained, after 544.000 MC steps, a configuration of packing fraction 0.60. If we take a look at the various bond order parameters of this configuration, we obtain the following plots:

![](https://i.imgur.com/xmdfye8.png) ![](https://i.imgur.com/jlAyITT.png) ![](https://i.imgur.com/aQH0pRH.png) ![](https://i.imgur.com/bzODnpN.png) ![](https://i.imgur.com/nlX1fcg.png)

We observe that the bond and orientational order parameters of this configuration agree quite well with the fluid order parameters, so we can assume that this configuration is fluid enough to use as a starting configuration for the nucleation simulations.
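The bops definition above breaks off in these notes. For reference, the standard Steinhardt form, which the $q_l$'s used throughout this document appear to follow (this is the literature definition, not copied from our code), is

$$
q_{lm}(i) = \frac{1}{N_b(i)}\sum_{j=1}^{N_b(i)} Y_{lm}(\hat{r}_{ij}), \qquad
q_l(i) = \sqrt{\frac{4\pi}{2l+1}\sum_{m=-l}^{l}\big|q_{lm}(i)\big|^2},
$$

with $N_b(i)$ the number of neighbors of particle $i$, $\hat{r}_{ij}$ the unit vector pointing from particle $i$ to neighbor $j$, and $Y_{lm}$ the spherical harmonics.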
## Decoupled volume changes

Up until now, we have only considered volume changes that compress/expand the volume equally in all directions. However, since our particles are anisotropic, it is not unlikely that the favoured distance between particles differs between the z-direction (the direction of the long axis of the particles in the starting crystal configuration) and the xy-plane (the plane in which the triangles lie in the starting crystal configuration). Therefore, we test whether different results are obtained if one compresses the x- and y-directions separately from the z-direction. This is implemented such that in the case of a volume change, the x- and y-box dimensions are changed with probability 1/2, and the box length in the z-direction is changed with probability 1/2. This adjustment is especially relevant in the crystal regime, because there it is important that the system has the freedom to arrange itself such that the closest packing can be obtained, and thus that it has freedom separately along the long axes of the particles and within the plane in which the anisotropic faces of the particles lie. The following results were obtained:

![](https://i.imgur.com/hfez4p2.png) ![](https://i.imgur.com/DX9JEqH.png) ![](https://i.imgur.com/sahNzNa.png)

In the first plot, the red dots indicate the final packing fractions for each pressure when decoupled volume changes are implemented, whereas the orange dots are the data points for the coupled volume changes. We observe that the results differ slightly, but the differences are not major. In the second plot, the ratio of the box lengths in the x- and y-directions is plotted as a function of the number of Monte Carlo steps. This ratio does not vary over the Monte Carlo steps, which is as expected, because the x- and y-box dimensions are not changed separately. In the third plot, however, the ratio of the z- and x-box dimensions is plotted. This does vary over the number of Monte Carlo steps, since the xy-plane and the z-direction are scaled separately. From this plot, the ideal ratio between the x- and z-dimensions of the box can be derived (so that eventually we can check whether this ratio is similar in the coupled volume change case, because then the results are expected to be similar). I can do this for all pressures and look at the final ratio between the box lengths in the z- and x-directions. For this, I used the data of EOS2/decoupledcrystal2, and averaged the dimensions of the box over the last 300.000 MC steps (of the total of 700.000 MC steps). The result is given in the plot below:

![](https://i.imgur.com/h1BSsKt.png)

The same thing can be done using the unit cell dimensions. We assume that, also in the fluid, the numbers of particles in the x-, y- and z-directions are the same as in the initial configuration. The initial configuration consists of 28 (x) * 16 (y) * 24 (z) particles. Dividing the box dimensions by the number of particles in each direction gives the dimensions of a unit cell in each direction. Plotting those results gives the following plot:

![](https://i.imgur.com/T3mVI7d.png)

## Colouring particles

## Long simulations

## Nucleation simulations

On Hera, Marjolein has run 10 runs, for different packing fractions: [0.51, 0.53, 0.55, 0.57, 0.59]. Each run comprises $3*10^7$ Monte Carlo steps, and every $2*10^4$ steps the coordinates of all particles are written to a file.
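Looking back at the decoupled volume move described at the start of this section, a minimal sketch (hypothetical names; `any_overlap` stands in for the SAT overlap scan, and the acceptance rule is the standard hard-particle NPT one, not copied from our code):

```c
#include <math.h>
#include <stdlib.h>

static double rnd(void) { return (double)rand() / ((double)RAND_MAX + 1.0); }

int any_overlap(const double box[3]);   /* hypothetical SAT overlap scan */

/* Decoupled volume move: with probability 1/2 scale x and y together,
 * otherwise scale z alone; box[0..2] are the box lengths. */
int decoupled_volume_move(double box[3], double betaP, double deltaV, int npart)
{
    double Vold = box[0] * box[1] * box[2];
    double Vnew = Vold + deltaV * (2.0 * rnd() - 1.0);
    if (Vnew <= 0.0) return 0;

    double newbox[3] = { box[0], box[1], box[2] };
    if (rnd() < 0.5) {                        /* change x and y together */
        double s = sqrt(Vnew / Vold);
        newbox[0] *= s;
        newbox[1] *= s;
    } else {                                  /* change z alone */
        newbox[2] *= Vnew / Vold;
    }

    /* NPT acceptance for a volume displacement linear in V:
     * min(1, exp(-betaP*(Vnew - Vold) + npart*log(Vnew/Vold))) */
    if (rnd() >= exp(-betaP * (Vnew - Vold) + npart * log(Vnew / Vold)))
        return 0;

    /* rescale all particle positions affinely into newbox (omitted here),
     * then reject and restore the old box if any pair of prisms overlaps */
    if (any_overlap(newbox)) return 0;

    box[0] = newbox[0]; box[1] = newbox[1]; box[2] = newbox[2];
    return 1;
}
```

Note that only the direction of the scaling differs from the coupled move: the $(V_{new}/V_{old})^N$ factor is the same whether the rescaling is applied to x and y or to z, so the acceptance rule is unchanged.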
Of these packing fractions, 0.51 appeared to be the most usable, since here the nucleation did not happen too fast. Still, after $5*10^6$ MC steps nucleation was clearly visible, so the simulation was stopped.

## Radial distribution function

In order to compare the cutoff radius determined by SANN with the spatial distribution of particles, I plotted the radial distribution function for both the fluid and the crystal phase. For the fluid phase, I used the first 80 snapshots of run 1 at packing fraction 0.51 (dataspontaneous/run2/0p51/). For both crystal and fluid, I used 1000 bins dividing the smallest box dimension. Note that some particle pairs are then not taken into account, but that is not such a problem, considering our system contains 10752 particles. The crystal configuration was obtained by running NVT on a crystal configuration of packing fraction 0.59 with z/x box-dimension ratio $\frac{18.557614}{15.749482} = 1.178$. While writing this, I realised this ratio is not right: it is based on earlier results for the ideal ratio between unit cell dimensions, not box dimensions. So I should re-run the NVT simulations of a crystal at packing fraction 0.59 with the right box dimension ratio. But the results for this run are (I took the last 80 configurations of dataspontaneous/crystalthermalnoise/0.59/coord; these are also in the folder radDist):

![](https://i.imgur.com/w7745bk.png)

Observe that for the fluid phase, the first minimum is at $r=0.9$. If we take a look at the results of SANN for a fluid configuration, it puts the cutoff radius at about $r = 1.0$-$1.1$. There is a small difference there: SANN's cutoff radius lies in between the first minimum and the second maximum. This is not such a problem, since the difference is small, and $g(r)$ is not as clean as for spherical particles, because of the particle anisotropy. For the crystal phase, the first minimum is at about $r = 0.97$, and SANN's cutoff radius is about 1.05. So SANN's cutoff radius again lies slightly beyond the first minimum, but this is not a big issue, for the reasons mentioned before.

Taking a further look at SANN and its cutoff radii, it would be nice to know the distances of each particle to its neighbors. Knowing the size of our particles, these can be calculated:

![](https://i.imgur.com/EVtXvWQ.jpg)

The closest distances are:

* $\frac{1}{\sqrt{3}}$ for the green particles,
* $\frac{1}{\sqrt{3}}$ for the particles directly above/below,
* $\sqrt{\frac{2}{3}}$ for the green particles one layer above/below,
* 1 for the pink particles,
* $\frac{2\sqrt{3}}{3}$ for the orange particles.

So SANN usually gives particles 1, 2, 3 and the particles directly above/below as neighbors, and sometimes also particles 1, 2, 3 one layer above/below.

## Re-run of crystal $\eta = 0.59$ for thermal noise

As noted earlier, a mistake was made in the box dimensions of the crystal with thermal noise. So we ran the NVT simulation on a crystal of packing fraction 0.59 again, to obtain a crystal with thermal noise. I used the

## Classification of particles

In order to gain insight into the fluid-ness/crystal-ness of particles, we can colour them depending on their values of various order parameters. We will look into D3 (dotsav 3; this is more of a positional order parameter) and S2 (more of an orientational order parameter). Beware, some sloppy notation follows: $\xi$ indicates the cutoff value for the number of solid-like connections (usually that number itself is denoted $\xi$ and the cutoff value $\xi_c$).
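To make this notation concrete before diving into D3 and S2: a minimal sketch of the classification rule used below (hypothetical names; the neighbor lists come from SANN, and $S_2(i,j)$ is the expression given in the S2 subsection):

```c
/* A connection (i,j) is solid-like if S2(i,j) >= s2_cutoff; particle i is
 * solid-like if it has more than xi solid-like connections. u[i] is the
 * orientation unit vector of particle i (presumably its long axis). */
int is_solid(int i, int nneigh, const int *neigh, double (*u)[3],
             double s2_cutoff, int xi)
{
    int nconn = 0;
    for (int k = 0; k < nneigh; k++) {
        int j = neigh[k];
        double dot = u[i][0]*u[j][0] + u[i][1]*u[j][1] + u[i][2]*u[j][2];
        double s2 = 1.5 * dot - 0.5;      /* S2(i,j) as defined below */
        if (s2 >= s2_cutoff) nconn++;     /* solid-like connection */
    }
    return nconn > xi;                    /* solid-like particle */
}
```

The same skeleton works for the D3 classification by swapping in $D_3(i,j)$ and its cutoff.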
### D3

If we take a look at the histograms (Marjolein has made them, STILL NEED TO MAKE THEM MYSELF) of D3 for fluid and crystal, we observe two peaks, one at +1 and one at -1. We would like to know which particles give rise to $D_3 = +1$ and which to $D_3 = -1$. For this analysis I took two particles from the last snapshot of crystalthermalnoise (a crystal configuration of packing fraction 0.59 that has run in NVT for 29500 MC steps). I chose particles 9 and 27, because these had only D3 values close to +1 and -1. Colouring the neighboring particles according to their value of D3:

* particle 9 ![](https://i.imgur.com/9KzEAF8.png)
* particle 27 ![](https://i.imgur.com/RpoyNnf.png)

Here, the green particles have $D_3(i,j) = +1$ and the blue ones have $D_3(i,j) = -1$. The red particle is the particle itself. We see that the particles with $D_3(i,j) = +1$ are the particles directly above or below the particle (neighboring on the triangular face side), whereas all other neighbors have $D_3(i,j)=-1$.

### S2

Firstly, I plotted, for a crystal configuration and a fluid configuration, a histogram of the values of S2 for each particle. For the fluid configuration I took the first snapshot of run 1 (dataspontaneous/run2/0p51/1/coord/coords_step00000000.poly), and for the crystal the last snapshot of the crystal configuration discussed earlier (with thermal noise) (dataspontaneous/crystalthermalnoisenvt/0.59/coord/coords_step0029500.poly). The value of S2 for a particle and one of its neighbors is given by $S_2(i,j) = \frac{3}{2}\vec{u_i}\cdot\vec{u_j} - \frac{1}{2}$. The results are given in the following plot:

![](https://i.imgur.com/fClAzrV.png)

We see that the graphs for the crystal and fluid configurations intersect at about 0.75, so we expect that if we define the S2 cutoff for a solid-like connection to be somewhere around 0.75, we can accurately determine the solid-ness of a specific particle (and thus see if any clusters are formed). After determining the number of solid-like connections of a particle, it is useful to see how many solid-like connections per particle are typical for a fluid and for a crystal. If we take $S_{2, cutoff} = 0.75$ (so a connection is said to be solid-like if $S_2(i,j) \geq 0.75$), the following plot is obtained (again using the same snapshots):

![](https://i.imgur.com/ZmPtkAP.png)

Observe that the graphs now intersect somewhere between 5 and 6 solid-like connections. So if we take the cutoff to be 5 or 6, we expect to be able to determine the crystal-ness of particles quite accurately. Now, I took a look at different cutoff values and how various snapshots would then be classified. An overview of screenshots of the snapshots can be seen below. The large red particles are classified as solid-like, the small blue ones as liquid-like. $\xi$ is here the cutoff value for the number of solid-like connections needed to be classified as solid (so if (number of solid-like connections) > $\xi$, a particle is solid-like). $S_{2, cutoff}$ is the minimal value of $S_2$ between a particle and its neighbor for the pair to count as a solid-like connection.

![](https://i.imgur.com/XT5VyQN.png) ![](https://i.imgur.com/pcZJ6qk.png) ![](https://i.imgur.com/wvqRAK9.png)

It can be seen that, as expected, higher values of $\xi$ lead to fewer particles being classified as solid-like. This makes sense: for a higher value of $\xi$, a higher number of solid-like connections is needed to be classified as solid, so fewer particles will be classified as such.
We furthermore observe that the influence of $S_{2, cutoff}$ is much smaller for these values: the difference between $S_{2, cutoff} = 0.73$ and $0.80$ for the same value of $\xi$ is quite small. I conclude that a value of $S_{2, cutoff} = 0.75$ is suitable. Now, what I can do is fix this value of $S_{2, cutoff}$ and colour the particles depending on whether they would be classified as solid for different values of $\xi$. Doing this for the first, the last and the 02000000 step respectively gives:

![](https://i.imgur.com/i1ZuPL1.png) ![](https://i.imgur.com/wO4AA4g.png) ![](https://i.imgur.com/kgN3zTA.png)

Here, particles are coloured:

* red: $\xi \geq 7$
* green: $\xi \geq 6$
* dark blue: $\xi \geq 5$
* yellow: $\xi \geq 4$
* light blue if for none of these values they are classified as solid.

We can do the same for the D3 order parameter. This gives the following results for respectively the first, the last and the 02000000 snapshot:

![](https://i.imgur.com/ikG7bZi.png) ![](https://i.imgur.com/zjLuSGb.png) ![](https://i.imgur.com/rmvhPEC.png)

The colouring scheme is the same as before. From the plots, we conclude that $\xi = 6$ is an appropriate value for S2, and $\xi = 5$ for D3. This is because we don't want too many particles to be classified as solid, but also not too few, and these values seem appropriate when we look at the snapshots and judge whether the coloured particles look solid-like or not. Note that up until now, when we classified snapshots with D3, we used $D_{3, cutoff} = 0.70$ and $\xi = 4$; according to this analysis, those settings pick up more solid-like particles than they should. The last thing we can do now is compare the results of the D3 and S2 classifications. Again for the first, the last and the 02000000 snapshot:

![](https://i.imgur.com/BaQfST9.png) ![](https://i.imgur.com/BY3VwXU.png) ![](https://i.imgur.com/wjxVDDF.png)

Here, the particles are coloured:

* green if both S2 and D3 classify them as solid-like,
* yellow if only S2 classifies them as solid-like,
* dark blue if only D3 classifies them as solid-like,
* small and light blue if neither classifies them as solid-like.

Here we used cutoff values $S_{2, cutoff} =0.75$, $D_{3, cutoff} = 0.70$ and $\xi = 6$ for both D3 and S2. We once again observe that S2 picks up more than D3.

## Mean Square Displacement

For run 1 at packing fraction 0.51 (TriangularPrism/dataspontaneous/run2/0.51/1), I plotted the mean square displacement for the first $1.2*10^6$ steps. After this, nucleation starts to become visible, so the system cannot be considered fully fluid anymore. This resulted in the following plot:

![](https://i.imgur.com/e7Q0kYn.png)

It can be seen that the trend is very linear, which is exactly what we expect for a fluid. To generate the plot, I calculated the square displacement $\Delta r^2$ of each particle between every pair of data files. Then, for each $\Delta(\text{number of MC steps})$, I divided the sum of the $\Delta r^2$'s by the number of file pairs with this difference in MC steps and by the number of particles. This gives the mean square displacement for each $\Delta(\text{number of MC steps})$. Note that the points at lower $\Delta(\text{number of MC steps})$ are therefore averages over more square displacements than those at higher $\Delta(\text{number of MC steps})$.
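A minimal sketch of this lag averaging (hypothetical names; it assumes the coordinates are unwrapped, i.e. not folded back into the box, and that the caller heap-allocates the arrays):

```c
#define NFILES 80      /* number of data files used */
#define NPART  10752   /* particles in the system */

/* msd[lag] = <|r(t+lag) - r(t)|^2>, averaged over all file pairs with that
 * lag and over all particles; pos[f][i][d] is coordinate d of particle i
 * in file f, with files evenly spaced in MC steps. */
void mean_square_displacement(double pos[NFILES][NPART][3], double msd[NFILES])
{
    msd[0] = 0.0;
    for (int lag = 1; lag < NFILES; lag++) {
        int norigins = NFILES - lag;          /* file pairs with this lag */
        double sum = 0.0;
        for (int t = 0; t < norigins; t++)
            for (int i = 0; i < NPART; i++) {
                double dr2 = 0.0;
                for (int d = 0; d < 3; d++) {
                    double dx = pos[t + lag][i][d] - pos[t][i][d];
                    dr2 += dx * dx;
                }
                sum += dr2;
            }
        msd[lag] = sum / ((double)norigins * NPART);
    }
}
```

This also makes explicit why the small-lag points are better averaged: a lag of one file has `NFILES - 1` origins, while the largest lag has only one.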
Doing a linear fit on this data, according to the function $y = 6Dt+C$ with D and C fitted, yields:

![](https://i.imgur.com/jEi8IFi.png)

Here, the fit is done only on the first 80 files in run2/0p51/1, because these files are certainly fluid, and we want to know the diffusion coefficient in the fluid phase.

## Packing fraction 0.505

Now, Marjolein has run 10 runs at packing fraction 0.505. She also did this for packing fraction 0.507, but since we saw that the last snapshots of 0.505 were nucleated, it makes more sense to analyse these: the lower the packing fraction, the slower the nucleation process, and that is what we want, because then we can study it in more detail. For run 1, the results look promising (see gif: dataspontaneous/run1/0p505/1/nucleation gif). This is the last snapshot:

![](https://i.imgur.com/CpzYEZe.png)

So we see that it has indeed nucleated, which is very nice! Interesting things to analyse now are the size of the biggest cluster in the system and its center of mass, for each snapshot. So this is what I did.

### Center of mass and cluster size

For each snapshot, the center of mass was calculated (in code get_nucleus_size.c (Documents/bops)). This was done by taking the biggest cluster and calculating the center of mass of the particles in it (keeping periodic boundary conditions in mind). This data was written to a file (bigclussize, in the same folder as the d3_classification (run1/0p505/1/coord_clas_d3_withclusterdata)). (NOTE: this cluster size is determined by the D3 classification; it does not take the S2 classification into account.) If we plot the results over time (python: run1/0p505/1/clusterplot.py), this is the result:

![](https://i.imgur.com/J1qle4J.png)

Here, the colour of the scatter points gives the normalised step number: the red dots are the last snapshots, and the dark blue ones the first. We see that, as expected, the center of mass converges. Also, by eye, it converges to the center of the big nucleus that we see in the last snapshot. (I checked that the center of mass calculation is right: for the first snapshot it is easy to find the biggest cluster, containing 4 particles, and its apparent coordinates comply with the calculated center of mass.)

I also plotted the size of the biggest cluster in each snapshot against the snapshot number (using the same python code, which uses the same data file as input (the COM coordinates are in the same file as the biggest cluster size)). The results are:

![](https://i.imgur.com/khPMCzR.png)

So indeed we see that we start with almost no clustering, and later on the cluster size starts increasing fast. Something that might be interesting is from what point on the biggest cluster is really the cluster that ends up growing. To get a little better feeling for the evolution of the x, y and z coordinates of the center of mass over time, we can plot the coordinates separately in a 2D plot. The results are given below (using the same data):

![](https://i.imgur.com/0a10iA8.png)

This suggests that nucleation takes place at around Monte Carlo step 164 * 20.000 = 3.280.000. However, we see that already earlier, the center of mass lands at around the same location more often than random placement would explain, indicating that there are signs of nucleation before that. This starts at around x = 132, so MC step 2.640.000.
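As an aside before looking at the snapshots: the 'keeping periodic boundary conditions in mind' above refers to the standard angular method for a periodic center of mass (the Wikipedia method also mentioned later in these notes). A minimal sketch with hypothetical names:

```c
#include <math.h>

/* Center of mass of n cluster members under periodic boundary conditions:
 * map each coordinate onto a circle, average the circle points, and map
 * the mean angle back to a box coordinate. */
void com_periodic(int n, const int *members, double (*pos)[3],
                  const double box[3], double com[3])
{
    for (int d = 0; d < 3; d++) {
        double xi = 0.0, zeta = 0.0;
        for (int k = 0; k < n; k++) {
            double theta = 2.0 * M_PI * pos[members[k]][d] / box[d];
            xi   += cos(theta);
            zeta += sin(theta);
        }
        /* mean angle in [0, 2*pi), then back to a coordinate */
        double thbar = atan2(-zeta / n, -xi / n) + M_PI;
        com[d] = box[d] * thbar / (2.0 * M_PI);
    }
}
```

This avoids the usual pathology where a cluster straddling a box edge gets assigned a center of mass in the middle of the box.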
Taking a look at the snapshots, we see that before step 2.640.000 there is not really a nucleus present at that location; then at step 2.920.000 it seems to disappear, and it appears again, larger, at step 3.000.000. Indicating these two points in time in the previous plots:

![](https://i.imgur.com/Ba2hdbe.png) ![](https://i.imgur.com/R2MQ7Vw.png)

The snapshots in and around this interval can be seen on my computer; uploading 53 screenshots here would clutter things somewhat, and I have not yet figured out how to upload GIFs here.

### Local order parameters

#### run 1

Something that is interesting to see now is how the local order parameters (D3, S2 and the local packing fraction) change over time in a region close to where nucleation takes place. For a first insight, I took (10.5, 10, 10.5) as the center of mass and looked within a sphere of radius 1 around that point. The bops are calculated using local-bops-nucleation.c (Documents/TriangularPrism/dataspontaneous/run1/0p505/1). It uses data_local_bops as input; this folder contains all coordinate files and a folder called voro with all voro_vol files. Plotting the results using localbopsplot.py, which is in the same folder, gives:

![](https://i.imgur.com/ZhtAOzT.png)

Now, this is very interesting! The red and blue vertical lines are at the same MC steps as before. We see that indeed, nucleation first happens around the blue line. Later on, the nucleus seems to disappear, and no clean nucleation seems to happen again after the red line, whereas the previous plots did indicate that this happened. My suggestion is that the center of mass of the nucleation after the red line is different (in the COM plot we also see that the coordinates are slightly different from before, and since the region we look in is not that big, the nucleation might not yet be inside the sphere). This might also explain why the second nucleation in this plot appears slower: there is a slow increase in the order parameters until nucleation seems to have happened, whereas the first event happens very suddenly. This could be because the nucleation happens outside the sphere and the crystal then slowly penetrates the sphere, thereby slowly increasing the order parameters. This seems like a difficult nucleation event: a nucleus forms, disappears, and later on a nucleus forms nearby, but not at exactly the same position, while the order parameters also do not go back down to their initial values, so it is not a nice small nucleus slowly starting to grow. That is why it may be useful to look at other runs.

#### run 2 (in run1/0p505)

Something interesting, but for our purposes not so nice, happens in run 2: two nuclei form, and they start competing over being the largest. Plotting the same things as before, using the same code, but now in the folder of run 2:

![](https://i.imgur.com/ulePfQ4.png) ![](https://i.imgur.com/7XttTzJ.png) ![](https://i.imgur.com/N0tbZW1.png)

We see jumps in the cluster size and the COM coordinates, indicating that we might have two competing nuclei. Looking at the snapshots, it very much looks like that is indeed the case. A pity, because that is not really the case we would like to investigate, so the local bops are not looked into further.
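For reference, the 'local order parameters in a sphere' used in these plots are averages of the following kind (a minimal sketch with hypothetical names):

```c
#include <math.h>

/* Average the per-particle order parameter op[] over all particles within
 * radius rcut of a fixed center, using minimum-image distances. */
double local_average(int npart, double (*pos)[3], const double *op,
                     const double box[3], const double center[3], double rcut)
{
    double sum = 0.0;
    int count = 0;
    for (int i = 0; i < npart; i++) {
        double r2 = 0.0;
        for (int d = 0; d < 3; d++) {
            double dx = pos[i][d] - center[d];
            dx -= box[d] * round(dx / box[d]);   /* minimum image */
            r2 += dx * dx;
        }
        if (r2 < rcut * rcut) {
            sum += op[i];
            count++;
        }
    }
    return count > 0 ? sum / count : 0.0;
}
```

The same loop with the local packing fraction as `op[]` (from the voro_vol files) gives the packing fraction curves.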
Happily, we have 10 runs, so let's move on to run 3!

#### run 3

The plots for run 3 (dataspontaneous/run1/0p505/3):

![](https://i.imgur.com/wOgVfn1.png) ![](https://i.imgur.com/oCeWd07.png) ![](https://i.imgur.com/Jg5YC5t.png)

Again with a vertical line that gives an indication of where nucleation happens; this is especially nice for comparing the different plots, because it gives some sort of reference point. Taking COM = (12.7, 11.6, 5.1) and again a sphere of radius 1, and calculating the local order parameters:

![](https://i.imgur.com/bJv7KGB.png)

Here we see that S2 starts increasing earlier and faster than D3 and the local packing fraction.

### Cluster size versus time for all runs

As before, we can plot the size of the biggest cluster versus the number of Monte Carlo steps. This gives:

| ![](https://i.imgur.com/VrkbPr9.jpg) | ![](https://i.imgur.com/xNzHuvJ.jpg) |
| ------------------------------------ | ------------------------------------ |
| ![](https://i.imgur.com/35OfsfA.jpg) | ![](https://i.imgur.com/iAzjz0Y.jpg) |
| ![](https://i.imgur.com/n5udBU6.jpg) | ![](https://i.imgur.com/mNAr8Zd.jpg) |
| ![](https://i.imgur.com/osRBaPv.jpg) | ![](https://i.imgur.com/5Gum9jd.jpg) |
| ![](https://i.imgur.com/dIfHgbn.jpg) | ![](https://i.imgur.com/fKTEBis.jpg) |

Here, we see that run 6 has not yet nucleated.

### Autocorrelation functions

To know how correlated the system is over time, it is useful to know the autocorrelation functions of all the order parameters (the ten Wolde BOPs ($q_l$), S2, P6 and the local packing fraction). The faster these autocorrelation functions go to zero, the faster the system loses information about previous configurations. The results are given in the plots below. The normalized autocorrelation function is $\text{ACF}(t) = C(t)/C(0)$, with

$$C(t) = \frac{1}{N}\frac{1}{\text{number of }t'}\sum_{i=1}^{N}\sum_{t'}\big(x_i(t')-\bar{x}\big)\big(x_i(t'+t)-\bar{x}\big).$$

![](https://i.imgur.com/7UG0QNf.png) ![](https://i.imgur.com/UgY8Reu.png) ![](https://i.imgur.com/ZrzOMIl.png)

These are calculated for the first 80 snapshots of run2/0p51/1, using the autocorrelation.c code (in run2/0p51). From these plots we know that S2 is almost fully decorrelated after 200.000 MC steps, and the q's and the local packing fraction after 150.000 MC steps. P6 is not looked at for now; it drops off really quickly. The average values of the order parameters used here are:

* $q_1=0.123452$
* $q_2 = 0.140365$
* $q_3 = 0.209812$
* $q_4 = 0.251241$
* $q_5 = 0.293906$
* $q_6 = 0.322450$
* $q_7 = 0.330558$
* $q_8 = 0.330258$
* $q_9 = 0.322498$
* $q_{10} = 0.311245$
* $q_{11} = 0.301127$
* $q_{12} = 0.297430$
* $S_2 = 0.069487$
* $P_6 = 0.279823$
* Packing fraction $= 0.514508$.

These are calculated by taking the average values over all the files.

### New simulations!

Using the data on the diffusion time (150.000 MC steps) and the decorrelation time (150.000-200.000 MC steps) of the system, Marjolein has now started running new simulations. In these simulations, the system configuration is written every 5000 MC steps, for $6*10^6$ MC steps in total (the first data file is overwritten with the last data file as soon as $6*10^6$ is reached). The simulation is stopped when the biggest nucleus reaches size 300. This is run with the same seeds and starting configurations as the 10 runs at packing fraction 0.505, so that we get the same results but more detailed, and for an additional 20 runs (with different starting configurations).
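A minimal sketch of this estimator as applied to per-particle order parameter time series (hypothetical names, not copied from autocorrelation.c):

```c
/* Normalized autocorrelation of x[i][t] (particle i, frame t), with xbar
 * the mean of x over all particles and frames; acf[] has nframes entries. */
void autocorrelation(int npart, int nframes, double **x, double xbar,
                     double *acf)
{
    for (int t = 0; t < nframes; t++) {
        int norig = nframes - t;                 /* number of t' values */
        double sum = 0.0;
        for (int i = 0; i < npart; i++)
            for (int tp = 0; tp < norig; tp++)
                sum += (x[i][tp] - xbar) * (x[i][tp + t] - xbar);
        acf[t] = sum / ((double)npart * norig);  /* this is C(t) */
    }
    double c0 = acf[0];
    for (int t = 0; t < nframes; t++)
        acf[t] /= c0;                            /* ACF(t) = C(t)/C(0) */
}
```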
### Run 6

There is a positive side to the fact that run 6 does not nucleate: we have very many snapshots of the fluid phase, which we can use to, for example, calculate the averages of the $q$'s, S2, P6 and the local packing fraction more accurately. So this is what we do. Plotting the biggest cluster size and the center of mass versus the number of MC steps for run 6:

![](https://i.imgur.com/8WbIvnO.png) ![](https://i.imgur.com/CrIP51H.png)

So we see that run 6 has nucleated after all! To calculate the autocorrelation functions for run 6, we use the first 1250 files, because after MC step $2.5*10^7$ a large nucleus forms. The resulting autocorrelation functions:

![](https://i.imgur.com/SesdnwN.png) ![](https://i.imgur.com/5VC8gAt.png)

With averages:

* $q_1=0.123339$
* $q_2 = 0.139683$
* $q_3 = 0.207437$
* $q_4 = 0.250460$
* $q_5 = 0.294208$
* $q_6 = 0.322718$
* $q_7 = 0.330167$
* $q_8 = 0.329309$
* $q_9 = 0.321581$
* $q_{10} = 0.310648$
* $q_{11} = 0.300874$
* $q_{12} = 0.297268$
* $S_2 = 0.061324$
* $P_6 =$ nan (fixed; run again to obtain value)
* Packing fraction $= 0.509448$
* Volume per unit cell $= 0.495050$
* d3 (no autocorrelation calculated) = 0.349841.

We calculated the average volume per unit cell because the difference between the average local packing fraction and the global packing fraction was quite large in the earlier autocorrelation analysis. With the volume per unit cell we check whether the calculations are carried out correctly, since this value should match the value obtained by dividing the volume of the box by the number of particles. This is the case, so everything is right.

## Local bops in frequent runs

Another simulation was run. Now, every 5000 MC steps an output file was created, up to a total of 1200 files. The simulation was automatically stopped when the largest cluster size exceeded a cutoff value (350 - check in code). If this value was not reached after writing 1200 data files, the first data files were overwritten. With this data, we can calculate for each snapshot the values of all bops for each particle. Then, determining the center of mass of the nucleus that starts growing in the end (I redid this calculation to make it suit the periodic BC; see Wikipedia), and defining a sphere of radius 1.5 around this center of mass (such that 30-40 particles are in it), we can evaluate the bops in this sphere. The plots for all runs can be obtained with the python script (dataspontaneous/run1_freq/0p505_new/localbopsplotnice_allruns.py). In these plots, the nice q's were chosen, based on the plots for all q's obtained with (run1_freq/0p505/9/localbopsplot.py). One of the very nice plots is shown below:

![](https://i.imgur.com/m3FUIIk.jpg)

For this plot, the Steinhardt bops (bops) were used. Another option is to use the Lechner averaged bops (bopsav). This gives the following results:

![](https://i.imgur.com/xbqYKEA.png)

It can be seen that S2 and D3 are similar (as expected, because they stay the same). However, for the q's, the noise in the fluid is reduced for the Lechner bops. Also, q5 shows wholly different behaviour (for Lechner it increases when going to the crystal, for Steinhardt it decreases). This is in agreement with the histograms plotted before, though. Interesting. The autocorrelation functions of the Lechner and Dellago bops can also be calculated:

![](https://i.imgur.com/dN8Itpl.png)

These drop off as rapidly as the Steinhardt bops autocorrelations.
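For reference: in its standard literature form (not copied from our code), the Lechner-Dellago averaging replaces the $q_{lm}$'s of a particle by an average over the particle and its neighbors before contracting over $m$:

$$
\bar{q}_{lm}(i) = \frac{1}{N_b(i)+1}\sum_{k \in \{i\}\cup\mathcal{N}(i)} q_{lm}(k), \qquad
\bar{q}_l(i) = \sqrt{\frac{4\pi}{2l+1}\sum_{m=-l}^{l}\big|\bar{q}_{lm}(i)\big|^2},
$$

with $\mathcal{N}(i)$ the neighbor set of particle $i$. This pre-averaging is what suppresses the fluid noise, at the cost of making the parameter less local.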
Then finally, we take a look into our solid-ness classification, by setting $\xi_c=4$ and checking whether the cluster size goes up earlier. The result is shown below (using localbopsplot to read in clusterdataxi4 and plot the cluster size). It can be seen that the cluster size with $\xi_c = 4$ does not increase significantly before the one with $\xi_c = 5$ does, so for the rest of this research we will use $\xi_c = 5$.

![](https://i.imgur.com/ZrzGYEc.png)

It makes sense that the graphs of the Lechner & Dellago bops look like this, because they sum before averaging, which causes minuses and pluses to cancel, giving less accurate results. In this respect, Emanuele Boattini's bops might be better. The results for these bops:

![](https://hackmd.io/_uploads/rksXThPEh.png) ![](https://hackmd.io/_uploads/SkY46hvV3.png)

We see that the autocorrelation function of q3 falls off more slowly for these bops, which means that there is more q3 correlation between snapshots for these bopsav2 q3's. It can also be seen that the bops-over-MC-steps plots contain a little less noise. However, the differences are not so big that these bops are easier to study, so for the rest of the research we choose to use the normal bops, because these are more local.

## On correlations

What we then wanted to know is whether there exists some correlation between high values of certain q's and others. So, whether particles with, for example, high q3 are also likely to have high S2. For this, we study the infrequent snapshots of run 6 (dataspontaneous/run1/0p505/6). The first 1200 snapshots are classified as fluid, according to the study of the size of the largest cluster over time and the number of solid-like particles over time:

![](https://hackmd.io/_uploads/HyuMyTvN2.png)

Here we see that up to and including $2.5*10^7$ MC steps (1250 files), no significant nucleation has happened. To leave some room for precursors of nucleation, we take the first 1200 snapshots into account in the following analysis. The bops of these snapshots are put in a big list, and correlations are calculated. These can be seen in (Spearman correlations.xlsx):

![](https://i.imgur.com/DCZRHSz.png)

The correlations turn out not to be very large. The maximum correlations are about 0.25, which is not really big. This confirms that the fluid is indeed quite disordered. To check whether there exists some correlation if, for example, only the more crystal-like particles are taken into account, we perform the following analysis: we define a cutoff, and only the particles with an order parameter higher than the cutoff are taken into account. This is done for various cutoffs of S2 and D3. The results are in (correlation_S2_0p6.xlsx, for an S2 cutoff of 0.6). The results are too extensive to show here, so they are left out. It should be noted that the correlation never exceeds 0.4, so it is still quite low. To gain some more insight into those results, we can plot the correlation as a function of the cutoff. Below are the following correlations:

* correlation with q3 for cutoff value of q3 ![](https://hackmd.io/_uploads/BJ1sMaw42.png)
* correlation with S2 for cutoff value of q3 ![](https://hackmd.io/_uploads/HybjfTPV3.png)
* correlation with q3 for cutoff value of S2 ![](https://hackmd.io/_uploads/H1kn76D4h.png)
* correlation with S2 for cutoff value of S2 ![](https://hackmd.io/_uploads/SJd2m6vE2.png)

Probably an S2 cutoff of 0.9 is a little too high, so that only few particles remain and the correlations start deviating from the behaviour expected based on the lower cutoffs.
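As an aside on these tables: a Spearman correlation is just a Pearson correlation computed on ranks. A minimal sketch (hypothetical names; ties are handled crudely by ordinal ranking, which is fine for continuous bops):

```c
#include <math.h>
#include <stdlib.h>

static const double *g_key;
static int cmp_idx(const void *a, const void *b)
{
    double d = g_key[*(const int *)a] - g_key[*(const int *)b];
    return (d > 0) - (d < 0);
}

/* r[i] = rank of x[i] (1..n) */
static void ranks(const double *x, double *r, int n)
{
    int *idx = malloc(n * sizeof *idx);
    for (int i = 0; i < n; i++) idx[i] = i;
    g_key = x;
    qsort(idx, n, sizeof *idx, cmp_idx);
    for (int i = 0; i < n; i++) r[idx[i]] = i + 1;
    free(idx);
}

/* Spearman correlation = Pearson correlation of the ranks */
double spearman(const double *x, const double *y, int n)
{
    double *rx = malloc(n * sizeof *rx), *ry = malloc(n * sizeof *ry);
    ranks(x, rx, n);
    ranks(y, ry, n);
    double mx = 0.0, my = 0.0;
    for (int i = 0; i < n; i++) { mx += rx[i]; my += ry[i]; }
    mx /= n; my /= n;
    double sxy = 0.0, sxx = 0.0, syy = 0.0;
    for (int i = 0; i < n; i++) {
        sxy += (rx[i] - mx) * (ry[i] - my);
        sxx += (rx[i] - mx) * (rx[i] - mx);
        syy += (ry[i] - my) * (ry[i] - my);
    }
    free(rx); free(ry);
    return sxy / sqrt(sxx * syy);
}
```

The cutoff variants above simply filter the particle list on one order parameter before running something like `spearman` on the remaining values.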
Also, we can take a look at what the density distribution looks like depending on cutoffs:

* packing fraction for different q3 cutoffs ![](https://hackmd.io/_uploads/BJz94pPNn.png)
* packing fraction for different S2 cutoffs ![](https://hackmd.io/_uploads/SkRnNTD43.png)
* q3 for different q3 cutoffs ![](https://hackmd.io/_uploads/ByNCVTv4h.png)
* q3 for different S2 cutoffs ![](https://hackmd.io/_uploads/r1d0N6vEn.png)
* q3 for different packing fraction cutoffs ![](https://hackmd.io/_uploads/BkaAVaDNh.png)
* S2 for different q3 cutoffs ![](https://hackmd.io/_uploads/BySJHaw4h.png)
* S2 for different S2 cutoffs ![](https://hackmd.io/_uploads/ryc1S6D42.png)
* S2 for different packing fraction cutoffs ![](https://hackmd.io/_uploads/S1ZlSavE3.png)

It is interesting to study what happens here, although one should be very careful drawing conclusions from it.

## Local bops improved

We have now edited the plots for the local bops to make them a little more insightful. An example for run 29 is given below:

![](https://hackmd.io/_uploads/H1Vjd3eB3.png)

In this plot, the horizontal lines indicate the averages of the order parameters that were obtained using the autocorrelation code. The center of mass and the total cluster size are in a different plot now, to keep this one a little smaller. The vertical line indicates the moment of nucleation, determined by looking at the moment where the order parameters go up. The grey area indicates 150.000 MC steps, which was determined earlier to be the number of steps the system needs to decorrelate.

## Local nucleation

To further study the nucleation, we dig deeper into run 29. What we can do is analyse the particles in the center of the 'nucleation sphere' and see how their BOPs evolve over time. Also, we can plot the standard deviation of the BOPs of the particles in the sphere over time and see how this evolves. For the standard deviation, the following results were obtained:

![](https://hackmd.io/_uploads/rJ3s6hgH2.png)

It is interesting to see that the standard deviations of D3 and S2 (and the packing fraction) go up sharply as soon as nucleation happens, whereas this does not happen for the bond order parameters. Now, to study the particles in the middle of the sphere, we first need to know which particles those are. This has been determined, and the following particles were found to be in the middle: 2851, 4810, 6513, 6754, 8975, 10174, 10636. These are the red particles in this figure (step 10770000):

![](https://hackmd.io/_uploads/BkLDR2lS2.png) ![](https://hackmd.io/_uploads/Hk9uA3lB3.png)

Note that the sphere was chosen by eyeballing the center of mass from the plot, so it may be that nucleation does not really happen in the center of the sphere. However, some alignment can be seen for the red particles. Now, we analyse the order parameters for only these particles. This gives the following plot:

![](https://hackmd.io/_uploads/HkSmJTlS3.png)

It can be seen that, compared to the plot with the bops of all particles in the sphere, this one goes up a little faster at nucleation, for example for S2. However, the differences are not really large. Also, we can track the OPs of these individual particles over time.
This gives the following results:

![](https://hackmd.io/_uploads/H1P3-6lrh.png) ![](https://hackmd.io/_uploads/BJD3-agrh.png) ![](https://hackmd.io/_uploads/Sy_hWpxH3.png) ![](https://hackmd.io/_uploads/HJdnWpeB2.png) ![](https://hackmd.io/_uploads/HyguhbplBh.png) ![](https://hackmd.io/_uploads/rkOh-pxB3.png) ![](https://hackmd.io/_uploads/BkYnWpgr2.png) ![](https://hackmd.io/_uploads/r1YhbTxHn.png) ![](https://hackmd.io/_uploads/SJYhbTerh.png) ![](https://hackmd.io/_uploads/HkF3Z6grn.png) ![](https://hackmd.io/_uploads/S1tnbpxB3.png) ![](https://hackmd.io/_uploads/SyF3WTlrh.png) ![](https://hackmd.io/_uploads/SythZaeS2.png) ![](https://hackmd.io/_uploads/HyF3baeB2.png) ![](https://hackmd.io/_uploads/ByKnW6lB3.png)

This looks like some very abstract art, but it is actually quite cool to see that the order parameters of individual particles go up so fast. For other particles, the order parameters do not go up that quickly, indicating that they are probably not yet part of the forming nucleus. After further investigation, it appears that these particles (for which the order parameters do not go up) in fact migrate out of the sphere, and probably are not part of the forming nucleus at all (as opposed to, for example, making up a boundary between crystal parts). Therefore, it was decided to remove these particles (2851, 4810 and 10174; they leave the sphere after files 629, 608 and 643 respectively) from the analysis. Those are the particles in green, which can be seen to sit quite close to the edge of the sphere already:

![](https://hackmd.io/_uploads/BJz7-lYHh.png) ![](https://hackmd.io/_uploads/BkfXZgYHn.png)

The downside is that only 4 particles are left... These are the particles in red:

![](https://hackmd.io/_uploads/S1zNWgKr2.png) ![](https://hackmd.io/_uploads/Hyz4bxtH2.png) ![](https://hackmd.io/_uploads/rJfNZlFBn.png)

The new plots are given below (with the less interesting plots left out; they can be found on my computer in the folder of run 29):

![](https://hackmd.io/_uploads/Syv6dltSn.png) ![](https://hackmd.io/_uploads/SkPT_xKr3.png) ![](https://hackmd.io/_uploads/SyDaOxKr2.png) ![](https://hackmd.io/_uploads/SJDpulKr2.png) ![](https://hackmd.io/_uploads/ryP6_lKHn.png) ![](https://hackmd.io/_uploads/BkDpuxKr3.png) ![](https://hackmd.io/_uploads/S1S-KlYSn.png)

## Look into templating & crystallites

To study the process of templating and the way the largest nucleus in the system grows, we take a look at the different crystallites that make up the nucleus. For this, we consider all particles that are part of the largest nucleus in the system with at least one particle in the COM sphere. We do this for run 6, since Marjolein saw that at the end of this run, some crystallites can clearly be distinguished in the largest nucleus. To do this, I determined recursively which particles are in the same crystallite, by computing $S_2(i,j)$ for particles $i$ and $j$: if this value is larger than 0.8, they are in the same crystallite. Then we can colour the crystallites separately, and we obtain this for the last snapshot of run 6:

![](https://hackmd.io/_uploads/HJw4yE0rh.png) ![](https://hackmd.io/_uploads/rJOEkVRSh.png)

Clearly, several distinct crystallites can be seen. That is nice!
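A minimal sketch of this recursive assignment (hypothetical names; the real code lives elsewhere, this only illustrates the flood fill):

```c
#define MAXNEIGH 32
#define S2_CRYST 0.8

/* Recursively give particle i and every connected nucleus member with
 * S2(i,j) > S2_CRYST the same crystallite label; cryst[] is -1 for
 * unassigned particles, in_nucleus[] flags membership of the nucleus. */
static void grow_crystallite(int i, int label, int *cryst,
                             const char *in_nucleus, const int *nneigh,
                             int (*neigh)[MAXNEIGH], double (*u)[3])
{
    cryst[i] = label;
    for (int k = 0; k < nneigh[i]; k++) {
        int j = neigh[i][k];
        if (!in_nucleus[j] || cryst[j] != -1) continue;
        double dot = u[i][0]*u[j][0] + u[i][1]*u[j][1] + u[i][2]*u[j][2];
        if (1.5 * dot - 0.5 > S2_CRYST)      /* S2(i,j) above the cutoff */
            grow_crystallite(j, label, cryst, in_nucleus, nneigh, neigh, u);
    }
}

/* Label all n nucleus members; returns the number of crystallites found.
 * Starting a fresh label for every still-unassigned member also picks up
 * single-particle crystallites (the bug mentioned below). */
int label_crystallites(int n, const int *nucleus, int *cryst,
                       const char *in_nucleus, const int *nneigh,
                       int (*neigh)[MAXNEIGH], double (*u)[3])
{
    int nlabels = 0;
    for (int k = 0; k < n; k++)
        if (cryst[nucleus[k]] == -1)
            grow_crystallite(nucleus[k], nlabels++, cryst,
                             in_nucleus, nneigh, neigh, u);
    return nlabels;
}
```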
If we now plot the number of particles per crystallite over time, we obtain the following plot:

![](https://hackmd.io/_uploads/S1X31VRrn.png)

And for the number of distinct crystallites over time:

![](https://hackmd.io/_uploads/rJUhJV0S2.png)

These plots are very fun to look at, so I am starting to generate this data for all runs. I did notice a small error in the code, which caused single-particle crystallites not to be picked up. I fixed this in the code (the results are in run6/crytallites_improved), and the improved code has been used for all other runs.

## Orientations plot

What I also tried is to plot, over time, the Euler angles of all particles in the largest nucleus with at least one particle in the COM sphere (for run 29). The following results were obtained:

![](https://hackmd.io/_uploads/B1v9lEAS2.png) ![](https://hackmd.io/_uploads/B1D9xVCH3.png) ![](https://hackmd.io/_uploads/rkv5eV0B2.png)

And for the angle around the y-axis versus the angle around the x-axis, with time as a colour:

![](https://hackmd.io/_uploads/Hkv3g4Cr2.png)

However, these results are not very insightful, so we do not look at them for too long.

## Autocorrelation functions

Suddenly I started doubting my autocorrelation results, but then I checked everything and also calculated the results in another way, and they turn out to be correct (phew); both autocorrelation.c and autocorrelation-new.c are correct codes.

# To do (now)

- [x] Format the localbops plots nicely: zoomed in, leave out COM, make sure vertical line correct, include horizontal line for fluid average, etc.
- [x] Run 6 infrequent: plot # solidlike particles
- [x] Spearman correlation between q's for run 6 infrequent (if not many particles are solid)
- [x] Spearman correlation in (large) bins
- [x] Look at particles in nucleus 29 (start), track these particles before, during, after nucleation. How do OP's evolve? Standard deviation q's for particles in sphere?
- [x] Process feedback on methods thesis
- [x] Write hard cube section in thesis
- [x] Write equation of state TriPrism section in thesis
- [x] Write order parameter section in thesis
- [x] Start analysing templating, by plotting the orientation of particles over time
- [x] Write orientation in HackMD
- [x] Analyse templating by looking for crystallite sizes
- [ ] Take a look at the environments which have high q3 values
- [x] Plot the packing fraction histograms with an S2 cutoff
- [x] Write about method to calculate S2 in thesis
- [x] Plot crystallites as a fraction

# To Do (later)

- [x] Edit hard cube code, correcting the errors that came up with the triangular prisms, and generate new data
- [x] Plot compression branch for the triangular prisms
- [x] Run the simulation with more trial moves
- [x] Determine the number of steps over which the packing fraction is averaged
- [x] Include errorbars in equation of state
- [x] Implement decoupled volume changes
- [x] Write theory section of thesis
- [x] Work out BOPS in HackMD
- [ ] Check z/x ratio of crystal NVT configuration