No longer reserved for tech enthusiasts, smartwatches have rapidly gained popularity and become one of the most desirable gadgets among the general public. However, this popularity also introduces potential vulnerabilities. To date, the de facto solution for protecting smartwatches has been passwords, i.e., either PINs or Android Pattern Locks (APLs). Unfortunately, such passwords are not robust against various forms of attack, such as shoulder surfing or touch/motion-based side-channel attacks. In this paper, we propose a novel authentication approach for smartwatches that adds another layer of security on top of traditional passwords by exploiting the unique motion signatures produced when different users enter passwords on their watches. It uses a deep recurrent neural network to analyse the subtle motion signals generated during password input and distinguish legitimate users from malicious impostors. In a privacy-preserving manner, our approach does not require users to upload their passcodes for model training, but only the motion data and identity labels. Extensive experiments on large-scale datasets collected in the real world show that the proposed approach significantly outperforms the state of the art, even in the most challenging case where a user has multiple distinct passcodes.
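To make the idea concrete, below is a minimal sketch of the kind of recurrent model the abstract describes: an LSTM that classifies a motion sequence recorded during password entry as legitimate user vs. impostor. It assumes PyTorch and 6-axis (accelerometer + gyroscope) input; the class name, layer sizes, sequence length, and binary head are illustrative assumptions, not the paper's exact configuration.

```python
# A minimal sketch (assumed PyTorch, 6-axis motion input); hyperparameters
# are illustrative and not the paper's exact configuration.
import torch
import torch.nn as nn


class MotionAuthRNN(nn.Module):
    """LSTM-based classifier over motion signals recorded during password entry."""

    def __init__(self, input_dim: int = 6, hidden_dim: int = 64, num_layers: int = 2):
        super().__init__()
        # Stacked LSTM consumes the raw motion sequence (batch, time, channels).
        self.rnn = nn.LSTM(input_dim, hidden_dim, num_layers, batch_first=True)
        # Binary head: legitimate user vs. impostor.
        self.head = nn.Linear(hidden_dim, 2)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time_steps, input_dim) motion samples; note that only the
        # motion data and identity labels are needed, never the passcode itself.
        _, (h_n, _) = self.rnn(x)
        # Use the final hidden state of the top LSTM layer as the sequence summary.
        return self.head(h_n[-1])


if __name__ == "__main__":
    model = MotionAuthRNN()
    batch = torch.randn(8, 200, 6)  # 8 password-entry attempts, 200 samples each
    logits = model(batch)           # (8, 2) legitimate-vs-impostor scores
    print(logits.shape)
```

Because the model consumes only sensor sequences and identity labels, training it does not require access to the passcode content, which is the privacy-preserving property highlighted above.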