Human Activity Recognition (HAR), the use of machine learning to identify activities such as walking, sitting, and standing, is widely applied in health and wellness wearable devices, in ambient assisted living, and in rehabilitation. In this paper, a stacked Long Short-Term Memory (LSTM) architecture is designed for HAR and implemented on a smartphone. Processing on an edge device means that the raw collected data need not be sent to the cloud, mitigating potential bandwidth, power consumption, and privacy concerns. Our offline prototype model achieves 92.8% classification accuracy on 6 activities using a public dataset. Quantizing the model's weight representations is shown to yield a more than 30x reduction in model size for deployment on a smartphone. The end result is an on-phone HAR model with an accuracy of 92.7% and a memory footprint of 27 KB.
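To make the quantization idea concrete, the following is a minimal sketch of uniform affine quantization, a common way to shrink weight storage by mapping 32-bit floats onto 8-bit integers. This is a generic illustration only; the paper's actual scheme (bit-width, per-layer scales, framework) is not assumed here, and the function names are hypothetical.

```python
def quantize_uint8(weights):
    """Map a list of float weights onto 8-bit integers (0..255).

    Returns the quantized values plus the scale and zero point
    needed to approximately recover the original floats.
    Illustrative sketch only, not the paper's exact method.
    """
    w_min, w_max = min(weights), max(weights)
    scale = (w_max - w_min) / 255.0 or 1.0  # guard against a constant tensor
    zero_point = round(-w_min / scale)
    q = [max(0, min(255, round(w / scale) + zero_point)) for w in weights]
    return q, scale, zero_point


def dequantize(q, scale, zero_point):
    """Recover approximate float weights from quantized values."""
    return [(v - zero_point) * scale for v in q]
```

Storing each weight in one byte instead of four gives a 4x saving on weight storage alone; larger end-to-end reductions such as the >30x reported here generally also depend on how the deployed model format stores parameters and metadata.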