Approach pre-trained deep learning models with caution

By Cecelia Shao, Comet.ml

How many times have you run the following snippet:

```python
import torchvision.models as models
inception = models.inception_v3(pretrained=True)
```

or:

```python
from keras.applications.inception_v3 import InceptionV3
base_model = InceptionV3(weights='imagenet', include_top=False)
```

Using these pre-trained models has become the new standard for industry best practice. After all, why wouldn't you take advantage of a model that has been trained on more data and compute than you could ever muster by yourself?

See the discussion on Reddit and HackerNews.

Long live pre-trained models! There are several substantial benefits to leveraging them:

- they are super simple to incorporate
- they achieve solid (same or even better) model performance quickly
- they require much less labeled data
- they support versatile use cases spanning transfer learning, prediction, and feature extraction

Advances within the NLP space have also encouraged the use of pre-trained language models like GPT and GPT-2, Allen…

kdnuggets.com