Hi Folks,
We are doing some testing on the emacspeak mailing list virtual machine. In the worst case this testing might cause a reboot. We will of course try to avoid that outcome, but we wanted to give you all a heads up in case it does happen and you notice the system disappear for a few minutes. Again, we do not anticipate an unplanned reboot. If you do notice the system is not responding for 10 minutes or more, please write to me directly.
That's it. But if you want to know more, here is what is going on:
We originally brought up emacspeak.org as a virtual system with one gig of RAM. In 2021 we had some issues with the system reporting it was short on memory. At that time we doubled its allocation from one to two gigs of RAM. This eliminated the issues we were seeing but had the side effect of doubling the monthly cost for the host. We have since tried a few things and reduced the system's memory footprint. Now we will be testing whether it will run happily again on just one gig of RAM.
The plan is to create a ramdisk and slowly increase its size, monitoring for any issues. Since ramdisk pages are held in memory, growing it reduces the RAM available to everything else, so over the next few weeks we will be running the system with a reduced amount of available RAM. If we can get to a simulated maximum of one gig of RAM, then we will leave it there for a few weeks and continue to monitor.
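For the curious, here is a rough sketch in Python of how a test like this can work. It reserves memory by growing a file on a tmpfs mount such as /dev/shm, since tmpfs pages are held in RAM. The file name, step size, and pacing below are made up for illustration and are not the exact steps we are running:

    #!/usr/bin/env python3
    # Illustrative sketch only: reserve RAM by growing a file on tmpfs.
    # The path, step size, and pacing here are hypothetical.
    import os
    import time

    RESERVE_FILE = "/dev/shm/memtest"   # tmpfs-backed, so the file lives in RAM
    STEP = 64 * 1024 * 1024             # grow by 64 MB at a time
    TARGET = 1024 * 1024 * 1024         # hold 1 GB of the 2 GB host, simulating a 1 GB system

    held = 0
    with open(RESERVE_FILE, "wb") as f:
        while held < TARGET:
            f.write(b"\0" * STEP)       # written pages stay resident in memory
            f.flush()
            held += STEP
            print("holding", held // (1024 * 1024), "MB")
            time.sleep(60)              # real pacing would be days, not minutes

    # If the reduced memory causes trouble, deleting the file releases
    # the reserved pages right away:
    # os.remove(RESERVE_FILE)

Deleting the file frees the held memory immediately, which is why we expect to be able to back out of most problems without a reboot.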
The testing might itself cause some problems. If there are any, we should be able to free up the blocked memory. But there is a chance we may need to reboot the system to fully restore things; if so, we may do that without much warning. There is also a very small chance the system could reboot itself. A reboot will cause it to revert to its full amount of RAM. If either of these reboots happens, the system would be offline for just a few minutes.
If all goes well and the system is running happily when restricted to one gig of RAM, then we will schedule a migration back to a one-gig configuration. That will of course be announced here before it happens.
More when we know more,
Greg