Running the 3D ConvNet Demo

Source: Internet · Editor: 程序博客网 · Published: 2024/06/10 22:29

3D ConvNet1 (C3D) is a three-dimensional convolutional neural network proposed by Du Tran et al. in 2015 for extracting spatial and temporal features from video. This walk-through of the demo follows the C3D User Guide; along the way I note the pitfalls I ran into myself, so that you can avoid them.

Run Train 3D ConvNet Demo

Change directory to YOUR_C3D_HOME/examples/c3d_train_ucf101/

  • Compute volume mean from list
    • run sh create_volume_mean.sh to compute the volume mean file.
  • Train your own network from scratch
    • run sh train_ucf101.sh to train; expect a couple of days for it to finish.
    • when running train_ucf101.sh (after changing data_dir to my own path and setting the GPU device to #1, which was available), I encountered the following error:
I0921 21:17:04.632088 6032 video_data_layer.cpp:344] read video from /MY_DATA_PATH/v_JumpingJack_g25_c05/
F0921 21:17:04.632129 6032 video_data_layer.cpp:346] Check failed: ReadImageSequenceToVolumeDatum(file_list_[id].c_str(), 1, label_list_[id], new_length, new_height, new_width, sampling_rate, &datum)
*** Check failure stack trace: ***
    @     0x7efd01251c7d  google::LogMessage::Fail()
    @     0x7efd01253b30  google::LogMessage::SendToLog()
    @     0x7efd01251842  google::LogMessage::Flush()
    @     0x7efd0125454e  google::LogMessageFatal::~LogMessageFatal()
    @           0x4b01fe  caffe::VideoDataLayer<>::SetUp()
    @           0x4458cb  caffe::Net<>::Init()
    @           0x446b75  caffe::Net<>::Net()
    @           0x43412f  caffe::Solver<>::Init()
    @           0x43783b  caffe::Solver<>::Solver()
    @           0x40b826  main
    @     0x7efcfcfd9a40  (unknown)
    @           0x40e079  _start
Aborted (core dumped)
  • Solution: comparing my extracted frame files with the ones in the demo, I found that the file extensions differed: the author's frames are saved as JPG while mine were JPEG, so I renamed mine to JPG.
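The rename above can be scripted instead of done by hand. The sketch below is my own helper (the function name and the `frames_root` argument are mine, not part of the demo); it assumes frames live in per-video subdirectories and only the extension needs to change:

```python
from pathlib import Path

def normalize_frame_extensions(frames_root):
    """Rename every .jpeg frame under frames_root to .jpg, the
    extension the C3D demo's extracted frames use. Returns the
    number of files renamed."""
    renamed = 0
    # materialize the listing first so renaming does not disturb the scan
    for jpeg in list(Path(frames_root).rglob("*.jpeg")):
        jpeg.rename(jpeg.with_suffix(".jpg"))
        renamed += 1
    return renamed
```

Run it once over your extracted-frames directory (e.g. `normalize_frame_extensions("/MY_DATA_PATH")`) before starting training.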
  • With the problem above solved, I encountered a new error:
sdy@sdy:~/Git/C3D/examples/c3d_train_ucf101$ sh train_ucf101.sh
I0108 11:25:15.496800  6760 train_net.cpp:26] Starting Optimization
I0108 11:25:15.496939  6760 solver.cpp:41] Creating training net.
I0108 11:25:15.505625  6760 net.cpp:76] Creating Layer data
I0108 11:25:15.505667  6760 net.cpp:112] data -> data
I0108 11:25:15.505692  6760 net.cpp:112] data -> label
I0108 11:25:15.512724  6760 video_data_layer.cpp:283] Opening file ../c3d_finetuning/train_01.lst
I0108 11:25:15.601454  6760 video_data_layer.cpp:312] Shuffling data
I0108 11:25:15.989821  6760 video_data_layer.cpp:317] A total of 107258 video chunks.
I0108 11:25:15.989881  6760 video_data_layer.cpp:344] read video from /home/sdy/Git/C3D/data/UCF101/frames/v_JumpingJack_g25_c05/
I0108 11:25:16.024459  6760 video_data_layer.cpp:365] output data size: 30,3,16,112,112
I0108 11:25:16.024513  6760 video_data_layer.cpp:387] Loading mean file from ucf101_train_mean.binaryproto
F0108 11:25:16.024785  6760 blob.cpp:98] Check failed: data_
*** Check failure stack trace: ***
    @     0x7fa6f45a8daa  (unknown)
    @     0x7fa6f45a8ce4  (unknown)
    @     0x7fa6f45a86e6  (unknown)
    @     0x7fa6f45ab687  (unknown)
    @           0x436159  caffe::Blob<>::mutable_cpu_data()
    @           0x4371e6  caffe::Blob<>::FromProto()
    @           0x4ce1c5  caffe::VideoDataLayer<>::SetUp()
    @           0x45ea1f  caffe::Net<>::Init()
    @           0x4602c0  caffe::Net<>::Net()
    @           0x43aaee  caffe::Solver<>::Init()
    @           0x43fb8a  caffe::Solver<>::Solver()
    @           0x40b15f  main
    @     0x7fa6f0a95f45  (unknown)
    @           0x40de1e  (unknown)
    @              (nil)  (unknown)
Aborted (core dumped)
  • Solution: the error points at the ucf101_train_mean.binaryproto file, so I reran create_volume_mean.sh, which failed with the error shown below. According to the author's answers to similar issues on GitHub, this kind of failure comes from the train list: either a file path in the list is wrong, or the frame files are not numbered starting from 000001. Comparing the actual file paths against the list file, I found two videos whose directory names differed from the list in letter case, e.g. the directory was named v_HandstandPushups_g01_c01 but the list entry was /home/…/v_HandStandPushups_g01_c01/ 1 36. Correcting these entries fixed the problem.
sdy@sdy:~/Git/C3D/examples/c3d_train_ucf101$ sh create_volume_mean.sh
I0108 15:20:14.439486  8571 compute_volume_mean_from_list.cpp:53] using dropping rate 10
I0108 15:20:14.479281  8571 compute_volume_mean_from_list.cpp:80] Starting Iteration
F0108 15:20:54.331308  8571 compute_volume_mean_from_list.cpp:92] Check failed: size_in_datum == data_size (0 vs. 1050624) Incorrect data field size 0
*** Check failure stack trace: ***
    @     0x7f0e683a0daa  (unknown)
    @     0x7f0e683a0ce4  (unknown)
    @     0x7f0e683a06e6  (unknown)
    @     0x7f0e683a3687  (unknown)
    @           0x408d20  main
    @     0x7f0e64f91f45  (unknown)
    @           0x408edd  (unknown)
    @              (nil)  (unknown)
Aborted (core dumped)
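Both failure modes the author mentions (wrong path, including case mismatches on case-sensitive filesystems, and frames not starting at 000001) can be caught before rerunning the mean computation. Below is my own checker sketch, under these assumptions: list lines look like `<frame_dir> <start_frame> <label>` as in the demo's .lst files, frames are named 000001.jpg onward, and the function name is mine:

```python
import os

def check_train_list(list_path):
    """Return (frame_dir, reason) pairs for every list entry whose frame
    directory is missing (flagging exact-case mismatches separately) or
    whose frames do not start at 000001.jpg."""
    problems = []
    with open(list_path) as f:
        for line in f:
            parts = line.split()
            if not parts:
                continue
            frame_dir = parts[0]
            parent, name = os.path.split(frame_dir.rstrip("/"))
            siblings = os.listdir(parent) if os.path.isdir(parent) else []
            if name not in siblings:
                # listdir comparison is exact-case, unlike os.path.exists
                # on case-insensitive filesystems
                close = [s for s in siblings if s.lower() == name.lower()]
                reason = "case mismatch with %s" % close[0] if close else "missing"
                problems.append((frame_dir, reason))
            elif not os.path.exists(os.path.join(parent, name, "000001.jpg")):
                problems.append((frame_dir, "frames do not start at 000001.jpg"))
    return problems
```

Running it over train_01.lst would have reported the two v_HandstandPushups directories immediately instead of after a forty-second crash.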
  • After these corrections, create_volume_mean.sh ran successfully and generated the ucf101_train_mean.binaryproto file:
sdy@sdy:~/Git/C3D/examples/c3d_train_ucf101$ sh create_volume_mean.sh
I0108 22:02:42.597549 22827 compute_volume_mean_from_list.cpp:53] using dropping rate 10
I0108 22:02:42.662868 22827 compute_volume_mean_from_list.cpp:80] Starting Iteration
E0108 22:05:40.792213 22827 compute_volume_mean_from_list.cpp:106] Processed 10000 files.
E0108 22:05:50.637400 22827 compute_volume_mean_from_list.cpp:112] Processed 10725 files.
I0108 22:05:50.639452 22827 compute_volume_mean_from_list.cpp:119] Write to ucf101_train_mean.binaryproto
  • Run train_ucf101.sh again; this time training starts normally:
sdy@sdy:~/Git/C3D/examples/c3d_train_ucf101$ sh train_ucf101.sh
I0108 22:10:21.996961 22888 train_net.cpp:26] Starting Optimization
I0108 22:10:21.997050 22888 solver.cpp:41] Creating training net.
I0108 22:10:21.997517 22888 net.cpp:76] Creating Layer data
I0108 22:10:21.997531 22888 net.cpp:112] data -> data
I0108 22:10:21.997541 22888 net.cpp:112] data -> label
I0108 22:10:21.997555 22888 video_data_layer.cpp:283] Opening file ../c3d_finetuning/train_01.lst
I0108 22:10:22.070096 22888 video_data_layer.cpp:312] Shuffling data
I0108 22:10:22.391844 22888 video_data_layer.cpp:317] A total of 107258 video chunks.
I0108 22:10:22.391902 22888 video_data_layer.cpp:344] read video from /home/sdy/Git/C3D/data/UCF101/frames/v_JumpingJack_g25_c05/
I0108 22:10:22.410243 22888 video_data_layer.cpp:365] output data size: 30,3,16,112,112
I0108 22:10:22.410284 22888 video_data_layer.cpp:387] Loading mean file from ucf101_train_mean.binaryproto
I0108 22:10:22.429491 22888 net.cpp:127] Top shape: 30 3 16 112 112 (18063360)
I0108 22:10:22.429550 22888 net.cpp:127] Top shape: 30 1 1 1 1 (30)
I0108 22:10:22.429569 22888 net.cpp:159] data does not need backward computation.
I0108 22:10:22.429590 22888 net.cpp:76] Creating Layer conv1a
I0108 22:10:22.429605 22888 net.cpp:86] conv1a <- data
I0108 22:10:22.429625 22888 net.cpp:112] conv1a -> conv1a
I0108 22:10:22.595605 22888 net.cpp:127] Top shape: 30 64 16 112 112 (385351680)
I0108 22:10:22.595657 22888 net.cpp:154] conv1a needs backward computation.
I0108 22:10:22.595681 22888 net.cpp:76] Creating Layer relu1a
I0108 22:10:22.595695 22888 net.cpp:86] relu1a <- conv1a
I0108 22:10:22.595716 22888 net.cpp:100] relu1a -> conv1a (in-place)
I0108 22:10:22.595736 22888 net.cpp:127] Top shape: 30 64 16 112 112 (385351680)
I0108 22:10:22.595749 22888 net.cpp:154] relu1a needs backward computation.
I0108 22:10:22.595764 22888 net.cpp:76] Creating Layer pool1
I0108 22:10:22.595777 22888 net.cpp:86] pool1 <- conv1a
I0108 22:10:22.595790 22888 net.cpp:112] pool1 -> pool1
I0108 22:10:22.598541 22888 net.cpp:127] Top shape: 30 64 16 56 56 (96337920)
I0108 22:10:22.598567 22888 net.cpp:154] pool1 needs backward computation.
I0108 22:10:22.598587 22888 net.cpp:76] Creating Layer conv2a
I0108 22:10:22.598598 22888 net.cpp:86] conv2a <- pool1
I0108 22:10:22.598610 22888 net.cpp:112] conv2a -> conv2a
I0108 22:10:22.605746 22888 net.cpp:127] Top shape: 30 128 16 56 56 (192675840)
I0108 22:10:22.605775 22888 net.cpp:154] conv2a needs backward computation.
I0108 22:10:22.605792 22888 net.cpp:76] Creating Layer relu2a
I0108 22:10:22.605809 22888 net.cpp:86] relu2a <- conv2a
I0108 22:10:22.605823 22888 net.cpp:100] relu2a -> conv2a (in-place)
I0108 22:10:22.605852 22888 net.cpp:127] Top shape: 30 128 16 56 56 (192675840)
I0108 22:10:22.605873 22888 net.cpp:154] relu2a needs backward computation.
I0108 22:10:22.605896 22888 net.cpp:76] Creating Layer pool2
I0108 22:10:22.605916 22888 net.cpp:86] pool2 <- conv2a
I0108 22:10:22.605937 22888 net.cpp:112] pool2 -> pool2
I0108 22:10:22.605958 22888 net.cpp:127] Top shape: 30 128 8 28 28 (24084480)
I0108 22:10:22.605978 22888 net.cpp:154] pool2 needs backward computation.
I0108 22:10:22.605993 22888 net.cpp:76] Creating Layer conv3a
I0108 22:10:22.606014 22888 net.cpp:86] conv3a <- pool2
I0108 22:10:22.606034 22888 net.cpp:112] conv3a -> conv3a
I0108 22:10:22.634486 22888 net.cpp:127] Top shape: 30 256 8 28 28 (48168960)
I0108 22:10:22.634656 22888 net.cpp:154] conv3a needs backward computation.
I0108 22:10:22.634786 22888 net.cpp:76] Creating Layer relu3a
I0108 22:10:22.634914 22888 net.cpp:86] relu3a <- conv3a
I0108 22:10:22.635069 22888 net.cpp:100] relu3a -> conv3a (in-place)
I0108 22:10:22.635203 22888 net.cpp:127] Top shape: 30 256 8 28 28 (48168960)
I0108 22:10:22.635341 22888 net.cpp:154] relu3a needs backward computation.
I0108 22:10:22.635465 22888 net.cpp:76] Creating Layer pool3
I0108 22:10:22.635601 22888 net.cpp:86] pool3 <- conv3a
I0108 22:10:22.635732 22888 net.cpp:112] pool3 -> pool3
I0108 22:10:22.635854 22888 net.cpp:127] Top shape: 30 256 4 14 14 (6021120)
I0108 22:10:22.635974 22888 net.cpp:154] pool3 needs backward computation.
I0108 22:10:22.636106 22888 net.cpp:76] Creating Layer conv4a
I0108 22:10:22.636245 22888 net.cpp:86] conv4a <- pool3
I0108 22:10:22.636371 22888 net.cpp:112] conv4a -> conv4a
I0108 22:10:22.694723 22888 net.cpp:127] Top shape: 30 256 4 14 14 (6021120)
I0108 22:10:22.694777 22888 net.cpp:154] conv4a needs backward computation.
I0108 22:10:22.694797 22888 net.cpp:76] Creating Layer relu4a
I0108 22:10:22.694818 22888 net.cpp:86] relu4a <- conv4a
I0108 22:10:22.694831 22888 net.cpp:100] relu4a -> conv4a (in-place)
I0108 22:10:22.694844 22888 net.cpp:127] Top shape: 30 256 4 14 14 (6021120)
I0108 22:10:22.694855 22888 net.cpp:154] relu4a needs backward computation.
I0108 22:10:22.694867 22888 net.cpp:76] Creating Layer pool4
I0108 22:10:22.694877 22888 net.cpp:86] pool4 <- conv4a
I0108 22:10:22.694890 22888 net.cpp:112] pool4 -> pool4
I0108 22:10:22.694903 22888 net.cpp:127] Top shape: 30 256 2 7 7 (752640)
I0108 22:10:22.694913 22888 net.cpp:154] pool4 needs backward computation.
I0108 22:10:22.694928 22888 net.cpp:76] Creating Layer conv5a
I0108 22:10:22.694938 22888 net.cpp:86] conv5a <- pool4
I0108 22:10:22.694949 22888 net.cpp:112] conv5a -> conv5a
I0108 22:10:22.773874 22888 net.cpp:127] Top shape: 30 256 2 7 7 (752640)
I0108 22:10:22.773926 22888 net.cpp:154] conv5a needs backward computation.
I0108 22:10:22.773949 22888 net.cpp:76] Creating Layer relu5a
I0108 22:10:22.773969 22888 net.cpp:86] relu5a <- conv5a
I0108 22:10:22.773988 22888 net.cpp:100] relu5a -> conv5a (in-place)
I0108 22:10:22.774005 22888 net.cpp:127] Top shape: 30 256 2 7 7 (752640)
I0108 22:10:22.774024 22888 net.cpp:154] relu5a needs backward computation.
I0108 22:10:22.774041 22888 net.cpp:76] Creating Layer pool5
I0108 22:10:22.774058 22888 net.cpp:86] pool5 <- conv5a
I0108 22:10:22.774075 22888 net.cpp:112] pool5 -> pool5
I0108 22:10:22.774094 22888 net.cpp:127] Top shape: 30 256 1 4 4 (122880)
I0108 22:10:22.774111 22888 net.cpp:154] pool5 needs backward computation.
I0108 22:10:22.774137 22888 net.cpp:76] Creating Layer fc6
I0108 22:10:22.774155 22888 net.cpp:86] fc6 <- pool5
I0108 22:10:22.774173 22888 net.cpp:112] fc6 -> fc6
I0108 22:10:23.052224 22888 net.cpp:127] Top shape: 30 2048 1 1 1 (61440)
I0108 22:10:23.052269 22888 net.cpp:154] fc6 needs backward computation.
I0108 22:10:23.052285 22888 net.cpp:76] Creating Layer relu6
I0108 22:10:23.052304 22888 net.cpp:86] relu6 <- fc6
I0108 22:10:23.052322 22888 net.cpp:100] relu6 -> fc6 (in-place)
I0108 22:10:23.052332 22888 net.cpp:127] Top shape: 30 2048 1 1 1 (61440)
I0108 22:10:23.052350 22888 net.cpp:154] relu6 needs backward computation.
I0108 22:10:23.052361 22888 net.cpp:76] Creating Layer drop6
I0108 22:10:23.052369 22888 net.cpp:86] drop6 <- fc6
I0108 22:10:23.052379 22888 net.cpp:100] drop6 -> fc6 (in-place)
I0108 22:10:23.052394 22888 net.cpp:127] Top shape: 30 2048 1 1 1 (61440)
I0108 22:10:23.052404 22888 net.cpp:154] drop6 needs backward computation.
I0108 22:10:23.052419 22888 net.cpp:76] Creating Layer fc7
I0108 22:10:23.052428 22888 net.cpp:86] fc7 <- fc6
I0108 22:10:23.052438 22888 net.cpp:112] fc7 -> fc7
I0108 22:10:23.188469 22888 net.cpp:127] Top shape: 30 2048 1 1 1 (61440)
I0108 22:10:23.188519 22888 net.cpp:154] fc7 needs backward computation.
I0108 22:10:23.188541 22888 net.cpp:76] Creating Layer relu7
I0108 22:10:23.188555 22888 net.cpp:86] relu7 <- fc7
I0108 22:10:23.188575 22888 net.cpp:100] relu7 -> fc7 (in-place)
I0108 22:10:23.188585 22888 net.cpp:127] Top shape: 30 2048 1 1 1 (61440)
I0108 22:10:23.188594 22888 net.cpp:154] relu7 needs backward computation.
I0108 22:10:23.188606 22888 net.cpp:76] Creating Layer drop7
I0108 22:10:23.188614 22888 net.cpp:86] drop7 <- fc7
I0108 22:10:23.188623 22888 net.cpp:100] drop7 -> fc7 (in-place)
I0108 22:10:23.188633 22888 net.cpp:127] Top shape: 30 2048 1 1 1 (61440)
I0108 22:10:23.188643 22888 net.cpp:154] drop7 needs backward computation.
I0108 22:10:23.188657 22888 net.cpp:76] Creating Layer fc8
I0108 22:10:23.188699 22888 net.cpp:86] fc8 <- fc7
I0108 22:10:23.188721 22888 net.cpp:112] fc8 -> fc8
I0108 22:10:23.195559 22888 net.cpp:127] Top shape: 30 101 1 1 1 (3030)
I0108 22:10:23.195588 22888 net.cpp:154] fc8 needs backward computation.
I0108 22:10:23.195605 22888 net.cpp:76] Creating Layer loss
I0108 22:10:23.195618 22888 net.cpp:86] loss <- fc8
I0108 22:10:23.195628 22888 net.cpp:86] loss <- label
I0108 22:10:23.195967 22888 net.cpp:154] loss needs backward computation.
I0108 22:10:23.196009 22888 net.cpp:183] Collecting Learning Rate and Weight Decay.
I0108 22:10:23.196032 22888 net.cpp:176] Network initialization done.
I0108 22:10:23.196040 22888 net.cpp:177] Memory required for Data 3113914320
I0108 22:10:23.196116 22888 solver.cpp:44] Creating testing net.
I0108 22:10:23.197083 22888 net.cpp:76] Creating Layer data
I0108 22:10:23.197106 22888 net.cpp:112] data -> data
I0108 22:10:23.197119 22888 net.cpp:112] data -> label
I0108 22:10:23.197134 22888 video_data_layer.cpp:283] Opening file ../c3d_finetuning/test_01.lst
I0108 22:10:23.227228 22888 video_data_layer.cpp:312] Shuffling data
I0108 22:10:23.228469 22888 video_data_layer.cpp:317] A total of 41822 video chunks.
I0108 22:10:23.228487 22888 video_data_layer.cpp:344] read video from /home/sdy/Git/C3D/data/UCF101/frames/v_HammerThrow_g03_c02/
I0108 22:10:23.252522 22888 video_data_layer.cpp:365] output data size: 30,3,16,112,112
I0108 22:10:23.252559 22888 video_data_layer.cpp:387] Loading mean file from ucf101_train_mean.binaryproto
I0108 22:10:23.273470 22888 net.cpp:127] Top shape: 30 3 16 112 112 (18063360)
I0108 22:10:23.273516 22888 net.cpp:127] Top shape: 30 1 1 1 1 (30)
I0108 22:10:23.273537 22888 net.cpp:159] data does not need backward computation.
I0108 22:10:23.273558 22888 net.cpp:76] Creating Layer conv1a
I0108 22:10:23.273573 22888 net.cpp:86] conv1a <- data
I0108 22:10:23.273586 22888 net.cpp:112] conv1a -> conv1a
I0108 22:10:23.273968 22888 net.cpp:127] Top shape: 30 64 16 112 112 (385351680)
I0108 22:10:23.274107 22888 net.cpp:154] conv1a needs backward computation.
I0108 22:10:23.274211 22888 net.cpp:76] Creating Layer relu1a
I0108 22:10:23.274319 22888 net.cpp:86] relu1a <- conv1a
I0108 22:10:23.274411 22888 net.cpp:100] relu1a -> conv1a (in-place)
I0108 22:10:23.274513 22888 net.cpp:127] Top shape: 30 64 16 112 112 (385351680)
I0108 22:10:23.274618 22888 net.cpp:154] relu1a needs backward computation.
I0108 22:10:23.274719 22888 net.cpp:76] Creating Layer pool1
I0108 22:10:23.274817 22888 net.cpp:86] pool1 <- conv1a
I0108 22:10:23.274917 22888 net.cpp:112] pool1 -> pool1
I0108 22:10:23.275020 22888 net.cpp:127] Top shape: 30 64 16 56 56 (96337920)
I0108 22:10:23.275104 22888 net.cpp:154] pool1 needs backward computation.
I0108 22:10:23.275205 22888 net.cpp:76] Creating Layer conv2a
I0108 22:10:23.275306 22888 net.cpp:86] conv2a <- pool1
I0108 22:10:23.275405 22888 net.cpp:112] conv2a -> conv2a
I0108 22:10:23.283287 22888 net.cpp:127] Top shape: 30 128 16 56 56 (192675840)
I0108 22:10:23.283311 22888 net.cpp:154] conv2a needs backward computation.
I0108 22:10:23.283332 22888 net.cpp:76] Creating Layer relu2a
I0108 22:10:23.283350 22888 net.cpp:86] relu2a <- conv2a
I0108 22:10:23.283368 22888 net.cpp:100] relu2a -> conv2a (in-place)
I0108 22:10:23.283387 22888 net.cpp:127] Top shape: 30 128 16 56 56 (192675840)
I0108 22:10:23.283404 22888 net.cpp:154] relu2a needs backward computation.
I0108 22:10:23.283422 22888 net.cpp:76] Creating Layer pool2
I0108 22:10:23.283439 22888 net.cpp:86] pool2 <- conv2a
I0108 22:10:23.283458 22888 net.cpp:112] pool2 -> pool2
I0108 22:10:23.283476 22888 net.cpp:127] Top shape: 30 128 8 28 28 (24084480)
I0108 22:10:23.283493 22888 net.cpp:154] pool2 needs backward computation.
I0108 22:10:23.283511 22888 net.cpp:76] Creating Layer conv3a
I0108 22:10:23.283529 22888 net.cpp:86] conv3a <- pool2
I0108 22:10:23.283545 22888 net.cpp:112] conv3a -> conv3a
I0108 22:10:23.311955 22888 net.cpp:127] Top shape: 30 256 8 28 28 (48168960)
I0108 22:10:23.311991 22888 net.cpp:154] conv3a needs backward computation.
I0108 22:10:23.312014 22888 net.cpp:76] Creating Layer relu3a
I0108 22:10:23.312033 22888 net.cpp:86] relu3a <- conv3a
I0108 22:10:23.312052 22888 net.cpp:100] relu3a -> conv3a (in-place)
I0108 22:10:23.312069 22888 net.cpp:127] Top shape: 30 256 8 28 28 (48168960)
I0108 22:10:23.312086 22888 net.cpp:154] relu3a needs backward computation.
I0108 22:10:23.312105 22888 net.cpp:76] Creating Layer pool3
I0108 22:10:23.312122 22888 net.cpp:86] pool3 <- conv3a
I0108 22:10:23.312140 22888 net.cpp:112] pool3 -> pool3
I0108 22:10:23.312160 22888 net.cpp:127] Top shape: 30 256 4 14 14 (6021120)
I0108 22:10:23.312177 22888 net.cpp:154] pool3 needs backward computation.
I0108 22:10:23.312196 22888 net.cpp:76] Creating Layer conv4a
I0108 22:10:23.312212 22888 net.cpp:86] conv4a <- pool3
I0108 22:10:23.312229 22888 net.cpp:112] conv4a -> conv4a
I0108 22:10:23.369153 22888 net.cpp:127] Top shape: 30 256 4 14 14 (6021120)
I0108 22:10:23.369189 22888 net.cpp:154] conv4a needs backward computation.
I0108 22:10:23.369210 22888 net.cpp:76] Creating Layer relu4a
I0108 22:10:23.369228 22888 net.cpp:86] relu4a <- conv4a
I0108 22:10:23.369247 22888 net.cpp:100] relu4a -> conv4a (in-place)
I0108 22:10:23.369257 22888 net.cpp:127] Top shape: 30 256 4 14 14 (6021120)
I0108 22:10:23.369267 22888 net.cpp:154] relu4a needs backward computation.
I0108 22:10:23.369277 22888 net.cpp:76] Creating Layer pool4
I0108 22:10:23.369287 22888 net.cpp:86] pool4 <- conv4a
I0108 22:10:23.369297 22888 net.cpp:112] pool4 -> pool4
I0108 22:10:23.369307 22888 net.cpp:127] Top shape: 30 256 2 7 7 (752640)
I0108 22:10:23.369318 22888 net.cpp:154] pool4 needs backward computation.
I0108 22:10:23.369329 22888 net.cpp:76] Creating Layer conv5a
I0108 22:10:23.369338 22888 net.cpp:86] conv5a <- pool4
I0108 22:10:23.369349 22888 net.cpp:112] conv5a -> conv5a
I0108 22:10:23.427034 22888 net.cpp:127] Top shape: 30 256 2 7 7 (752640)
I0108 22:10:23.427076 22888 net.cpp:154] conv5a needs backward computation.
I0108 22:10:23.427098 22888 net.cpp:76] Creating Layer relu5a
I0108 22:10:23.427117 22888 net.cpp:86] relu5a <- conv5a
I0108 22:10:23.427135 22888 net.cpp:100] relu5a -> conv5a (in-place)
I0108 22:10:23.427146 22888 net.cpp:127] Top shape: 30 256 2 7 7 (752640)
I0108 22:10:23.427165 22888 net.cpp:154] relu5a needs backward computation.
I0108 22:10:23.427182 22888 net.cpp:76] Creating Layer pool5
I0108 22:10:23.427199 22888 net.cpp:86] pool5 <- conv5a
I0108 22:10:23.427214 22888 net.cpp:112] pool5 -> pool5
I0108 22:10:23.427233 22888 net.cpp:127] Top shape: 30 256 1 4 4 (122880)
I0108 22:10:23.427251 22888 net.cpp:154] pool5 needs backward computation.
I0108 22:10:23.427274 22888 net.cpp:76] Creating Layer fc6
I0108 22:10:23.427284 22888 net.cpp:86] fc6 <- pool5
I0108 22:10:23.427300 22888 net.cpp:112] fc6 -> fc6
I0108 22:10:23.692828 22888 net.cpp:127] Top shape: 30 2048 1 1 1 (61440)
I0108 22:10:23.692981 22888 net.cpp:154] fc6 needs backward computation.
I0108 22:10:23.693073 22888 net.cpp:76] Creating Layer relu6
I0108 22:10:23.693157 22888 net.cpp:86] relu6 <- fc6
I0108 22:10:23.693241 22888 net.cpp:100] relu6 -> fc6 (in-place)
I0108 22:10:23.693325 22888 net.cpp:127] Top shape: 30 2048 1 1 1 (61440)
I0108 22:10:23.693408 22888 net.cpp:154] relu6 needs backward computation.
I0108 22:10:23.693492 22888 net.cpp:76] Creating Layer drop6
I0108 22:10:23.693577 22888 net.cpp:86] drop6 <- fc6
I0108 22:10:23.693660 22888 net.cpp:100] drop6 -> fc6 (in-place)
I0108 22:10:23.693743 22888 net.cpp:127] Top shape: 30 2048 1 1 1 (61440)
I0108 22:10:23.693826 22888 net.cpp:154] drop6 needs backward computation.
I0108 22:10:23.693910 22888 net.cpp:76] Creating Layer fc7
I0108 22:10:23.693990 22888 net.cpp:86] fc7 <- fc6
I0108 22:10:23.694073 22888 net.cpp:112] fc7 -> fc7
I0108 22:10:23.833493 22888 net.cpp:127] Top shape: 30 2048 1 1 1 (61440)
I0108 22:10:23.833534 22888 net.cpp:154] fc7 needs backward computation.
I0108 22:10:23.833549 22888 net.cpp:76] Creating Layer relu7
I0108 22:10:23.833559 22888 net.cpp:86] relu7 <- fc7
I0108 22:10:23.833571 22888 net.cpp:100] relu7 -> fc7 (in-place)
I0108 22:10:23.833581 22888 net.cpp:127] Top shape: 30 2048 1 1 1 (61440)
I0108 22:10:23.833590 22888 net.cpp:154] relu7 needs backward computation.
I0108 22:10:23.833601 22888 net.cpp:76] Creating Layer drop7
I0108 22:10:23.833611 22888 net.cpp:86] drop7 <- fc7
I0108 22:10:23.833621 22888 net.cpp:100] drop7 -> fc7 (in-place)
I0108 22:10:23.833631 22888 net.cpp:127] Top shape: 30 2048 1 1 1 (61440)
I0108 22:10:23.833640 22888 net.cpp:154] drop7 needs backward computation.
I0108 22:10:23.833652 22888 net.cpp:76] Creating Layer fc8
I0108 22:10:23.833662 22888 net.cpp:86] fc8 <- fc7
I0108 22:10:23.833673 22888 net.cpp:112] fc8 -> fc8
I0108 22:10:23.840504 22888 net.cpp:127] Top shape: 30 101 1 1 1 (3030)
I0108 22:10:23.840533 22888 net.cpp:154] fc8 needs backward computation.
I0108 22:10:23.840553 22888 net.cpp:76] Creating Layer prob
I0108 22:10:23.840569 22888 net.cpp:86] prob <- fc8
I0108 22:10:23.840589 22888 net.cpp:112] prob -> prob
I0108 22:10:23.840607 22888 net.cpp:127] Top shape: 30 101 1 1 1 (3030)
I0108 22:10:23.840625 22888 net.cpp:154] prob needs backward computation.
I0108 22:10:23.840642 22888 net.cpp:76] Creating Layer accuracy
I0108 22:10:23.840659 22888 net.cpp:86] accuracy <- prob
I0108 22:10:23.840703 22888 net.cpp:86] accuracy <- label
I0108 22:10:23.840724 22888 net.cpp:112] accuracy -> accuracy
I0108 22:10:23.840745 22888 net.cpp:127] Top shape: 1 2 1 1 1 (2)
I0108 22:10:23.840764 22888 net.cpp:154] accuracy needs backward computation.
I0108 22:10:23.840780 22888 net.cpp:165] This network produces output accuracy
I0108 22:10:23.840811 22888 net.cpp:183] Collecting Learning Rate and Weight Decay.
I0108 22:10:23.840834 22888 net.cpp:176] Network initialization done.
I0108 22:10:23.840852 22888 net.cpp:177] Memory required for Data 3113926448
I0108 22:10:23.840927 22888 solver.cpp:49] Solver scaffolding done.
I0108 22:10:24.079597 22888 solver.cpp:61] Solving deep_c3d_ucf101
I0108 22:10:24.079658 22888 solver.cpp:106] Iteration 0, Testing net
I0108 22:11:26.784036 22888 solver.cpp:142] Test score #0: 0.00733333
I0108 22:11:26.784090 22888 solver.cpp:142] Test score #1: 4.69726
I0108 22:12:01.860625 22888 solver.cpp:237] Iteration 20, lr = 0.003
I0108 22:12:01.869343 22888 solver.cpp:87] Iteration 20, loss = 5.26183
I0108 22:12:37.542536 22888 solver.cpp:237] Iteration 40, lr = 0.003
I0108 22:12:37.542987 22888 solver.cpp:87] Iteration 40, loss = 4.71496
I0108 22:13:13.279296 22888 solver.cpp:237] Iteration 60, lr = 0.003
I0108 22:13:13.279765 22888 solver.cpp:87] Iteration 60, loss = 4.70241
I0108 22:13:49.041620 22888 solver.cpp:237] Iteration 80, lr = 0.003
I0108 22:13:49.056371 22888 solver.cpp:87] Iteration 80, loss = 4.65352
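To confirm the loss keeps falling over a multi-day run, it is handy to pull the (iteration, loss) pairs out of the solver log rather than scanning it by eye. A minimal sketch (my own helper; the regex targets glog lines of the form `Iteration 20, loss = 5.26183` shown above):

```python
import re

def extract_losses(log_text):
    """Return (iteration, loss) pairs from Caffe solver log output.
    Lines reporting lr (solver.cpp:237) do not match, only loss lines."""
    pattern = re.compile(r"Iteration (\d+), loss = ([\d.]+)")
    return [(int(it), float(loss)) for it, loss in pattern.findall(log_text)]
```

Redirect training output to a file (e.g. `sh train_ucf101.sh 2>&1 | tee train.log`) and feed its contents to this function to plot or tabulate the loss curve.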

  1. D. Tran, L. Bourdev, R. Fergus, L. Torresani, and M. Paluri. Learning spatiotemporal features with 3D convolutional networks. In ICCV, 2015. ↩