Until the advent of deep learning, a key task in machine learning was feature engineering: constructing a representation of raw data that allows a learning algorithm to identify patterns. We see similar 'hand-crafted' representations in physics: for instance, the entanglement spectrum often reveals phase transitions. Deep architectures automated the extraction of representations in machine learning, and tensor networks fulfil a similar role in many-body physics. Results proving the equivalence of the two paradigms are beginning to emerge. In this talk, we present work-in-progress results on the correspondence between hierarchical tensor networks and deep learning architectures.