@@ -0,0 +1,202 @@
                                 Apache License
                           Version 2.0, January 2004
                        http://www.apache.org/licenses/

   TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION

   1. Definitions.

      "License" shall mean the terms and conditions for use, reproduction,
      and distribution as defined by Sections 1 through 9 of this document.

      "Licensor" shall mean the copyright owner or entity authorized by
      the copyright owner that is granting the License.

      "Legal Entity" shall mean the union of the acting entity and all
      other entities that control, are controlled by, or are under common
      control with that entity. For the purposes of this definition,
      "control" means (i) the power, direct or indirect, to cause the
      direction or management of such entity, whether by contract or
      otherwise, or (ii) ownership of fifty percent (50%) or more of the
      outstanding shares, or (iii) beneficial ownership of such entity.

      "You" (or "Your") shall mean an individual or Legal Entity
      exercising permissions granted by this License.

      "Source" form shall mean the preferred form for making modifications,
      including but not limited to software source code, documentation
      source, and configuration files.

      "Object" form shall mean any form resulting from mechanical
      transformation or translation of a Source form, including but
      not limited to compiled object code, generated documentation,
      and conversions to other media types.

      "Work" shall mean the work of authorship, whether in Source or
      Object form, made available under the License, as indicated by a
      copyright notice that is included in or attached to the work
      (an example is provided in the Appendix below).

      "Derivative Works" shall mean any work, whether in Source or Object
      form, that is based on (or derived from) the Work and for which the
      editorial revisions, annotations, elaborations, or other modifications
      represent, as a whole, an original work of authorship. For the purposes
      of this License, Derivative Works shall not include works that remain
      separable from, or merely link (or bind by name) to the interfaces of,
      the Work and Derivative Works thereof.

      "Contribution" shall mean any work of authorship, including
      the original version of the Work and any modifications or additions
      to that Work or Derivative Works thereof, that is intentionally
      submitted to Licensor for inclusion in the Work by the copyright owner
      or by an individual or Legal Entity authorized to submit on behalf of
      the copyright owner. For the purposes of this definition, "submitted"
      means any form of electronic, verbal, or written communication sent
      to the Licensor or its representatives, including but not limited to
      communication on electronic mailing lists, source code control systems,
      and issue tracking systems that are managed by, or on behalf of, the
      Licensor for the purpose of discussing and improving the Work, but
      excluding communication that is conspicuously marked or otherwise
      designated in writing by the copyright owner as "Not a Contribution."

      "Contributor" shall mean Licensor and any individual or Legal Entity
      on behalf of whom a Contribution has been received by Licensor and
      subsequently incorporated within the Work.

   2. Grant of Copyright License. Subject to the terms and conditions of
      this License, each Contributor hereby grants to You a perpetual,
      worldwide, non-exclusive, no-charge, royalty-free, irrevocable
      copyright license to reproduce, prepare Derivative Works of,
      publicly display, publicly perform, sublicense, and distribute the
      Work and such Derivative Works in Source or Object form.

   3. Grant of Patent License. Subject to the terms and conditions of
      this License, each Contributor hereby grants to You a perpetual,
      worldwide, non-exclusive, no-charge, royalty-free, irrevocable
      (except as stated in this section) patent license to make, have made,
      use, offer to sell, sell, import, and otherwise transfer the Work,
      where such license applies only to those patent claims licensable
      by such Contributor that are necessarily infringed by their
      Contribution(s) alone or by combination of their Contribution(s)
      with the Work to which such Contribution(s) was submitted. If You
      institute patent litigation against any entity (including a
      cross-claim or counterclaim in a lawsuit) alleging that the Work
      or a Contribution incorporated within the Work constitutes direct
      or contributory patent infringement, then any patent licenses
      granted to You under this License for that Work shall terminate
      as of the date such litigation is filed.

   4. Redistribution. You may reproduce and distribute copies of the
      Work or Derivative Works thereof in any medium, with or without
      modifications, and in Source or Object form, provided that You
      meet the following conditions:

      (a) You must give any other recipients of the Work or
          Derivative Works a copy of this License; and

      (b) You must cause any modified files to carry prominent notices
          stating that You changed the files; and

      (c) You must retain, in the Source form of any Derivative Works
          that You distribute, all copyright, patent, trademark, and
          attribution notices from the Source form of the Work,
          excluding those notices that do not pertain to any part of
          the Derivative Works; and

      (d) If the Work includes a "NOTICE" text file as part of its
          distribution, then any Derivative Works that You distribute must
          include a readable copy of the attribution notices contained
          within such NOTICE file, excluding those notices that do not
          pertain to any part of the Derivative Works, in at least one
          of the following places: within a NOTICE text file distributed
          as part of the Derivative Works; within the Source form or
          documentation, if provided along with the Derivative Works; or,
          within a display generated by the Derivative Works, if and
          wherever such third-party notices normally appear. The contents
          of the NOTICE file are for informational purposes only and
          do not modify the License. You may add Your own attribution
          notices within Derivative Works that You distribute, alongside
          or as an addendum to the NOTICE text from the Work, provided
          that such additional attribution notices cannot be construed
          as modifying the License.

      You may add Your own copyright statement to Your modifications and
      may provide additional or different license terms and conditions
      for use, reproduction, or distribution of Your modifications, or
      for any such Derivative Works as a whole, provided Your use,
      reproduction, and distribution of the Work otherwise complies with
      the conditions stated in this License.

   5. Submission of Contributions. Unless You explicitly state otherwise,
      any Contribution intentionally submitted for inclusion in the Work
      by You to the Licensor shall be under the terms and conditions of
      this License, without any additional terms or conditions.
      Notwithstanding the above, nothing herein shall supersede or modify
      the terms of any separate license agreement you may have executed
      with Licensor regarding such Contributions.

   6. Trademarks. This License does not grant permission to use the trade
      names, trademarks, service marks, or product names of the Licensor,
      except as required for reasonable and customary use in describing the
      origin of the Work and reproducing the content of the NOTICE file.

   7. Disclaimer of Warranty. Unless required by applicable law or
      agreed to in writing, Licensor provides the Work (and each
      Contributor provides its Contributions) on an "AS IS" BASIS,
      WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
      implied, including, without limitation, any warranties or conditions
      of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
      PARTICULAR PURPOSE. You are solely responsible for determining the
      appropriateness of using or redistributing the Work and assume any
      risks associated with Your exercise of permissions under this License.

   8. Limitation of Liability. In no event and under no legal theory,
      whether in tort (including negligence), contract, or otherwise,
      unless required by applicable law (such as deliberate and grossly
      negligent acts) or agreed to in writing, shall any Contributor be
      liable to You for damages, including any direct, indirect, special,
      incidental, or consequential damages of any character arising as a
      result of this License or out of the use or inability to use the
      Work (including but not limited to damages for loss of goodwill,
      work stoppage, computer failure or malfunction, or any and all
      other commercial damages or losses), even if such Contributor
      has been advised of the possibility of such damages.

   9. Accepting Warranty or Additional Liability. While redistributing
      the Work or Derivative Works thereof, You may choose to offer,
      and charge a fee for, acceptance of support, warranty, indemnity,
      or other liability obligations and/or rights consistent with this
      License. However, in accepting such obligations, You may act only
      on Your own behalf and on Your sole responsibility, not on behalf
      of any other Contributor, and only if You agree to indemnify,
      defend, and hold each Contributor harmless for any liability
      incurred by, or claims asserted against, such Contributor by reason
      of your accepting any such warranty or additional liability.

   END OF TERMS AND CONDITIONS

   APPENDIX: How to apply the Apache License to your work.

      To apply the Apache License to your work, attach the following
      boilerplate notice, with the fields enclosed by brackets "[]"
      replaced with your own identifying information. (Don't include
      the brackets!) The text should be enclosed in the appropriate
      comment syntax for the file format. We also recommend that a
      file or class name and description of purpose be included on the
      same "printed page" as the copyright notice for easier
      identification within third-party archives.

   Copyright [yyyy] [name of copyright owner]

   Licensed under the Apache License, Version 2.0 (the "License");
   you may not use this file except in compliance with the License.
   You may obtain a copy of the License at

       http://www.apache.org/licenses/LICENSE-2.0

   Unless required by applicable law or agreed to in writing, software
   distributed under the License is distributed on an "AS IS" BASIS,
   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
   See the License for the specific language governing permissions and
   limitations under the License.
@@ -1,30 +1,33 @@
# 之江天枢 - Algorithm Side

The **之江天枢 One-Stop AI Open-Source Platform** (**之江天枢** for short) covers massive data processing, interactive model building (including Notebook and model visualization), and efficient AI model training. Its multi-dimensional product forms serve needs ranging from individual developers to large enterprises, raising the R&D efficiency of AI technology, widening the range of applications for algorithm models, and further building out the AI ecosystem's "circle of friends".

## Algorithm Deployment

For deployment, see the document **部署数据处理算法** (Deploying the Data Processing Algorithms) at http://tianshu.org.cn/?/course

## Code Structure

```
├── LICENSE
├── README.md
├── algorithm-annotation.py    # object detection and image classification
├── algorithm-imagenet.py      # ImageNet label handling for image classification
├── algorithm-imgprocess.py    # data augmentation
├── algorithm-ofrecord.py      # OFRecord data conversion
├── algorithm-track.py         # object tracking
├── algorithm-videosample.py   # video sampling
├── annotation.py
├── common                     # shared utilities
├── data
├── imagenet.py
├── imgprocess.py
├── luascript
├── of_model                   # OneFlow model files
├── ofrecord.py
├── predict_with_print_box.py
├── taskexecutor.py
├── track.py
├── track_only
└── videosample.py
```
@@ -0,0 +1,58 @@
"""
/**
 * Copyright 2020 Zhejiang Lab. All Rights Reserved.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 * http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 * =============================================================
 */
"""
# coding:utf-8
import threading
import time
import logging

import taskexecutor
import common.RedisUtil as f
import common.config as config
import annotation as annotation
import luascript.starttaskscript as start_script

logging.basicConfig(format='%(asctime)s - %(pathname)s[line:%(lineno)d] - %(levelname)s: %(message)s',
                    level=logging.DEBUG)

if __name__ == '__main__':
    # Automatic annotation algorithm entry.
    jsonData = config.loadJsonData(config.configPath)
    redisClient = f.getRedisConnection(jsonData["ip"], jsonData["port"], jsonData["database"], jsonData["password"])
    logging.info('init redis client %s', redisClient)
    # Background thread handling delayed keys (see taskexecutor.delayKeyThread).
    t = threading.Thread(target=taskexecutor.delayKeyThread, args=(redisClient,))
    t.daemon = True
    t.start()
    annotation._init()
    while True:
        try:
            if config.loadJsonData(config.sign) == 0:
                logging.info('no new task to execute')
                time.sleep(1)
            else:
                logging.info('got one task')
                # Atomically claim one task: move it from the pending queue into
                # the processing sorted set, scored by the claim timestamp.
                element = redisClient.eval(start_script.startTaskLua, 1, config.queue,
                                           config.annotationStartQueue, int(time.time()))
                if len(element) > 0:
                    taskexecutor.annotationExecutor(redisClient, element[0])
                else:
                    logging.info('task queue is empty.')
                    time.sleep(1)
        except Exception as e:
            logging.error('except: %s', e)
            time.sleep(1)
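Every entry script in this dump shares the claim loop above: `start_script.startTaskLua` (the Lua source is not part of this excerpt) atomically moves one task from the pending queue into a processing sorted set whose score is the claim timestamp, which is what lets a delay-key watchdog requeue stalled tasks later. A minimal pure-Python model of that contract, with hypothetical names and no Redis dependency:

```python
import time


def claim_task(pending, processing, now=None):
    """Model of the atomic claim step: pop the oldest pending task and
    record it in `processing` with the claim timestamp as its score.
    (In the real system this happens server-side inside a Lua script,
    so no other worker can claim the same task.)"""
    if not pending:
        return []                 # like EVAL returning an empty table
    task = pending.pop(0)         # LPOP equivalent
    processing[task] = now if now is not None else int(time.time())  # ZADD equivalent
    return [task]


pending = ["task:1", "task:2"]
processing = {}
claimed = claim_task(pending, processing, now=1700000000)
# claimed == ["task:1"]; processing == {"task:1": 1700000000}
```

Because the score is a timestamp, a watchdog can requeue any processing-set member older than some timeout, which appears to be the role of the `delayKeyThread` workers.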
@@ -0,0 +1,65 @@
# coding:utf-8
import json
import threading
import time
import logging

import imagenet as imagenet
import common.RedisUtil as f
import common.config as config
import luascript.starttaskscript as start_script

logging.basicConfig(format='%(asctime)s - %(pathname)s[line:%(lineno)d] - %(levelname)s: %(message)s',
                    level=logging.DEBUG)

if __name__ == '__main__':
    # Imagenet algorithm entry.
    jsonData = config.loadJsonData(config.configPath)
    redisClient = f.getRedisConnection(jsonData["ip"], jsonData["port"], jsonData["database"], jsonData["password"])
    logging.info('init redis client %s', redisClient)
    t = threading.Thread(target=imagenet.delayKeyThread, args=(redisClient,))
    t.daemon = True
    t.start()
    imagenet._init()
    while True:
        try:
            if config.loadJsonData(config.sign) == 0:
                logging.info('no new task to execute')
                time.sleep(1)
            else:
                logging.info('got one task')
                element = redisClient.eval(start_script.startTaskLua, 1, config.imagenetTaskQueue,
                                           config.imagenetStartQueue, int(time.time()))
                if len(element) > 0:
                    key = element[0].decode()
                    jsonStr = f.getByKey(redisClient, key.replace('"', ''))
                    result = imagenet.process(jsonStr, element[0])
                    logging.info("result: %s", json.dumps(result))
                    logging.info('save result to redis')
                    f.pushToQueue(redisClient, config.imagenetFinishQueue, json.dumps(result))
                    redisClient.zrem(config.imagenetStartQueue, element[0])
                else:
                    logging.info('task queue is empty.')
                    time.sleep(2)
        except Exception as e:
            logging.error('except: %s', e)
            time.sleep(1)
@@ -0,0 +1,54 @@
import threading
import time
import logging

import common.RedisUtil as f
import luascript.starttaskscript as start_script
import common.config as config
import imgprocess

logging.basicConfig(format='%(asctime)s - %(pathname)s[line:%(lineno)d] - %(levelname)s: %(message)s',
                    level=logging.DEBUG)

if __name__ == '__main__':
    # Enhancement (data augmentation) algorithm entry.
    jsonData = config.loadJsonData(config.configPath)
    redisClient = f.getRedisConnection(jsonData["ip"], jsonData["port"], jsonData["database"], jsonData["password"])
    logging.info('init redis client %s', redisClient)
    t = threading.Thread(target=imgprocess.delayKeyThread, args=(redisClient,))
    t.daemon = True
    t.start()
    while True:
        try:
            if config.loadJsonData(config.sign) == 0:
                logging.info('no new task to execute')
                time.sleep(5)
            else:
                enhanceTaskId = redisClient.eval(start_script.startTaskLua, 1, config.imgProcessTaskQueue,
                                                 config.imgProcessStartQueue, int(time.time()))
                if len(enhanceTaskId) > 0:
                    imgprocess.start_enhance_task(enhanceTaskId, redisClient)
                else:
                    logging.info('task queue is empty.')
                    time.sleep(5)
        except Exception as e:
            logging.error('except: %s', e)
            time.sleep(1)
@@ -0,0 +1,77 @@
# coding:utf-8
import os
import json
import threading
import time
import logging
import traceback

import common.RedisUtil as f
import common.config as config
import luascript.starttaskscript as start_script
import ofrecord

logging.basicConfig(format='%(asctime)s - %(pathname)s[line:%(lineno)d] - %(levelname)s: %(message)s',
                    level=logging.DEBUG)

basePath = '/nfs/'
descPath = 'ofrecord/train'

if __name__ == '__main__':
    # Ofrecord algorithm entry.
    jsonData = config.loadJsonData(config.configPath)
    redisClient = f.getRedisConnection(jsonData["ip"], jsonData["port"], jsonData["database"], jsonData["password"])
    logging.info('init redis client %s', redisClient)
    t = threading.Thread(target=ofrecord.delayKeyThread, args=(redisClient,))
    t.daemon = True
    t.start()
    element = None
    while True:
        try:
            if config.loadJsonData(config.sign) == 0:
                logging.info('no new task to execute')
                time.sleep(1)
            else:
                element = redisClient.eval(start_script.startTaskLua, 1, config.ofrecordTaskQueue,
                                           config.ofrecordStartQueue, int(time.time()))
                if len(element) > 0:
                    key = element[0].decode()
                    detail = f.getByKey(redisClient, key.replace('"', ''))
                    jsonStr = json.loads(detail.decode())
                    # Collect every dataset label, skipping the '@type' metadata key.
                    label_map = {}
                    for item in jsonStr["datasetLabels"].keys():
                        if item != '@type':
                            label_map[item] = jsonStr["datasetLabels"][item]
                    ofrecord.execute(os.path.join(basePath, jsonStr["datasetPath"]),
                                     os.path.join(basePath, jsonStr["datasetPath"], descPath),
                                     label_map,
                                     jsonStr["files"],
                                     jsonStr["partNum"],
                                     element[0])
                    logging.info('save result to redis')
                    f.pushToQueue(redisClient, config.ofrecordFinishQueue, key)
                    redisClient.zrem(config.ofrecordStartQueue, element[0])
                else:
                    logging.info('task queue is empty.')
                    time.sleep(2)
        except Exception as e:
            logging.error('except: %s', traceback.format_exc())
            # Drop the failed task from the processing queue if one was claimed.
            if element:
                redisClient.zrem(config.ofrecordStartQueue, element[0])
            time.sleep(1)
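The label-collection loop above keeps every entry of `datasetLabels` except the serializer's `@type` metadata key. With a hypothetical payload (values invented for illustration), it reduces to a one-line dict comprehension:

```python
# Hypothetical task detail shaped like the Redis payload consumed above;
# the '@type' entry is serializer metadata, not a real label.
datasetLabels = {"@type": "java.util.HashMap", "1": "cat", "2": "dog"}

# Keep every label except the '@type' marker.
label_map = {k: v for k, v in datasetLabels.items() if k != '@type'}
# label_map == {"1": "cat", "2": "dog"}
```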
@@ -0,0 +1,60 @@
# coding:utf-8
import threading
import time
import logging

import common.RedisUtil as f
import common.config as config
import luascript.starttaskscript as start_script
import track

logging.basicConfig(format='%(asctime)s - %(pathname)s[line:%(lineno)d] - %(levelname)s: %(message)s',
                    level=logging.DEBUG)

if __name__ == '__main__':
    # Track algorithm entry.
    jsonData = config.loadJsonData(config.configPath)
    redisClient = f.getRedisConnection(jsonData["ip"], jsonData["port"], jsonData["database"], jsonData["password"])
    logging.info('init redis client %s', redisClient)
    t = threading.Thread(target=track.delayKeyThread, args=(redisClient,))
    t.daemon = True
    t.start()
    while True:
        try:
            if config.loadJsonData(config.sign) == 0:
                logging.info('no new task to execute')
                time.sleep(1)
            else:
                logging.info('got one task')
                element = redisClient.eval(start_script.startTaskLua, 1, config.trackTaskQueue,
                                           config.trackStartQueue, int(time.time()))
                if len(element) > 0:
                    key = element[0].decode()
                    jsonStr = f.getByKey(redisClient, key.replace('"', ''))
                    if track.trackProcess(jsonStr, element[0]):
                        f.pushToQueue(redisClient, config.trackFinishQueue, key)
                        redisClient.zrem(config.trackStartQueue, element[0])
                        logging.info('success')
                else:
                    logging.info('task queue is empty.')
                    time.sleep(1)
        except Exception as e:
            logging.error('except: %s', e)
            time.sleep(1)
@@ -0,0 +1,62 @@
import json
import threading
import time
import logging
from datetime import datetime

import common.RedisUtil as f
import luascript.starttaskscript as start_script
import common.config as config
import videosample

logging.basicConfig(format='%(asctime)s - %(pathname)s[line:%(lineno)d] - %(levelname)s: %(message)s',
                    level=logging.DEBUG)

if __name__ == '__main__':
    # VideoSample algorithm entry.
    jsonData = config.loadJsonData(config.configPath)
    redisClient = f.getRedisConnection(jsonData["ip"], jsonData["port"], jsonData["database"], jsonData["password"])
    logging.info('init redis client %s', redisClient)
    t = threading.Thread(target=videosample.delayKeyThread, args=(redisClient,))
    t.daemon = True
    t.start()
    while True:
        try:
            if config.loadJsonData(config.sign) == 0:
                logging.info('no new task to execute')
                time.sleep(5)
            else:
                logging.info('read redis: %s', datetime.now().strftime("%Y-%m-%d %H:%M:%S"))
                sampleTask = redisClient.eval(start_script.startTaskLua, 1, config.videoPendingQueue,
                                              config.videoStartQueue, int(time.time()))
                if len(sampleTask) > 0:
                    datasetId = json.loads(sampleTask[0])['datasetIdKey']
                    taskParameters = json.loads(redisClient.get("videoSample:" + str(datasetId)))
                    path = taskParameters['path']
                    frameList = taskParameters['frames']
                    videosample.sampleProcess(datasetId, path, frameList, redisClient)
                else:
                    logging.info('task queue is empty.')
                    time.sleep(5)
        except Exception as e:
            logging.error('except: %s', e)
            time.sleep(1)
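The sampling branch above consumes two JSON payloads: the claimed element, which carries `datasetIdKey`, and a `videoSample:<datasetId>` key holding the video path and frame indices. A walkthrough with hypothetical values:

```python
import json

# Hypothetical payloads mirroring the two reads in the loop above.
sample_task = json.dumps({"datasetIdKey": 42})
stored = json.dumps({"path": "/nfs/dataset/42/video.mp4", "frames": [0, 30, 60]})

datasetId = json.loads(sample_task)['datasetIdKey']
taskParameters = json.loads(stored)
path = taskParameters['path']
frameList = taskParameters['frames']
# datasetId == 42; frameList == [0, 30, 60]
```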
@@ -0,0 +1,45 @@
# coding:utf-8
import sys

sys.path.append(r"./common")
import predict_with_print_box as yolo_demo
from common.log_config import setup_log

label_log = setup_log('dev', 'label.log')


def _init():
    """Create the global YOLO inference object."""
    print('init yolo_obj')
    global yolo_obj
    yolo_obj = yolo_demo.YoloInference(label_log)


def _annotation(type_, image_path_list, id_list, label_list, coco_flag=0):
    """Perform an automatic annotation task.

    The inference model expects batches of at least 16 images, so shorter
    lists are padded by repeating their first element; the results for the
    padding are sliced off before returning.
    """
    image_num = len(image_path_list)
    if image_num < 16:
        for i in range(16 - image_num):
            image_path_list.append(image_path_list[0])
            id_list.append(id_list[0])
    annotations = yolo_obj.yolo_inference(type_, id_list, image_path_list, label_list, coco_flag)
    return annotations[0:image_num]
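The pad-and-slice idea in `_annotation` can be isolated as a small pure function (a hypothetical helper, assuming a non-empty input list):

```python
def pad_batch(items, min_size=16):
    """Pad a non-empty list up to the model's minimum batch size by
    repeating its first element; return (padded, original_length) so the
    caller can slice the padding off the results afterwards."""
    original = len(items)
    padded = list(items)
    while len(padded) < min_size:
        padded.append(items[0])
    return padded, original


padded, n = pad_batch(["a.jpg", "b.jpg"])
# len(padded) == 16 and n == 2; results[:n] later drops the duplicates
```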
@@ -0,0 +1,42 @@
# coding:utf-8
import sys

import redis


def getRedisConnection(host, port, db, password):
    """Open a connection to the Redis server."""
    return redis.Redis(host=host, port=port, db=db, password=password)


def getOneMinScoreElement(f, queue):
    """Return (at most) one element with the lowest score in a sorted set."""
    return f.zrangebyscore(queue, 0, sys.maxsize, 0, 1)


def deleteElement(f, queue, element):
    """Remove an element from a sorted set."""
    f.zrem(queue, element)


def getByKey(f, key):
    """Get a value by key."""
    return f.get(key)


def pushToQueue(f, key, value):
    """Append a value to the tail of a list-based queue."""
    f.rpush(key, value)
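These helpers are thin wrappers over redis-py calls. A usage sketch against a minimal in-memory stand-in (two of the helpers are re-declared inline so the sketch runs standalone, without a Redis server):

```python
import sys


class FakeRedis:
    """Minimal in-memory stand-in for the redis-py methods used above."""
    def __init__(self):
        self.kv, self.lists, self.zsets = {}, {}, {}

    def get(self, key):
        return self.kv.get(key)

    def rpush(self, key, value):
        self.lists.setdefault(key, []).append(value)

    def zadd(self, queue, mapping):
        self.zsets.setdefault(queue, {}).update(mapping)

    def zrem(self, queue, element):
        self.zsets.get(queue, {}).pop(element, None)

    def zrangebyscore(self, queue, lo, hi, start, num):
        members = sorted(self.zsets.get(queue, {}).items(), key=lambda kv: kv[1])
        return [m for m, s in members if lo <= s <= hi][start:start + num]


def getOneMinScoreElement(f, queue):      # copy of the helper above
    return f.zrangebyscore(queue, 0, sys.maxsize, 0, 1)


def deleteElement(f, queue, element):     # copy of the helper above
    f.zrem(queue, element)


r = FakeRedis()
r.zadd('annotation_processing_queue', {'task:2': 200, 'task:1': 100})
oldest = getOneMinScoreElement(r, 'annotation_processing_queue')
# oldest == ['task:1']  (lowest claim timestamp first)
deleteElement(r, 'annotation_processing_queue', oldest[0])
```

Fetching the lowest-scored member first is what lets the watchdog threads inspect the longest-running claimed task.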
@@ -0,0 +1,69 @@
# coding:utf-8 | |||||
import json | |||||
host = '' | |||||
port = 6379 | |||||
db = 0 | |||||
password = '' | |||||
# annotation | |||||
queue = 'annotation_task_queue' | |||||
annotationStartQueue = 'annotation_processing_queue' | |||||
annotationFinishQueue = 'annotation_finished_queue' | |||||
# imagenet | |||||
imagenetTaskQueue = 'imagenet_task_queue' | |||||
imagenetStartQueue = 'imagenet_processing_queue' | |||||
imagenetFinishQueue = 'imagenet_finished_queue' | |||||
# ofrecord | |||||
ofrecordTaskQueue = 'ofrecord_task_queue' | |||||
ofrecordStartQueue = 'ofrecord_processing_queue' | |||||
ofrecordFinishQueue = 'ofrecord_finished_queue' | |||||
# track | |||||
trackTaskQueue = 'track_task_queue' | |||||
trackStartQueue = 'track_processing_queue' | |||||
trackFinishQueue = 'track_finished_queue' | |||||
# videosample | |||||
videoPendingQueue = "videoSample_unprocessed" | |||||
videoStartQueue = "videoSample_processing" | |||||
videoFinishQueue = "videoSample_finished" | |||||
videoFailedQueue = "videoSample_failed" | |||||
# imgprocess | |||||
imgProcessTaskQueue = 'imgProcess_unprocessed' | |||||
imgProcessFinishQueue = 'imgProcess_finished' | |||||
imgProcessStartQueue = "imgProcess_processing" | |||||
imgProcessFailedQueue = "imgProcess_failed" | |||||
threadCount = 5 | |||||
configPath = "/root/algorithm/config.json" | |||||
sign = "/root/algorithm/sign" | |||||
def loadJsonData(path): | |||||
with open(path, 'r', encoding='utf8') as fp: | |||||
jsonData = json.load(fp) | |||||
return jsonData |
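`loadJsonData` is a thin wrapper over `json.load`; a quick round-trip through a temporary file shows the expected usage (the config keys here are illustrative, not the real `/root/algorithm/config.json` contents):

```python
import json
import tempfile

def loadJsonData(path):
    with open(path, 'r', encoding='utf8') as fp:
        return json.load(fp)

# Write a throwaway config and read it back.
with tempfile.NamedTemporaryFile('w', suffix='.json', delete=False) as fp:
    json.dump({"host": "127.0.0.1", "port": 6379}, fp)
    tmp_path = fp.name

cfg = loadJsonData(tmp_path)
# cfg is a plain dict, e.g. cfg["port"] == 6379
```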
@@ -0,0 +1,80 @@
person
bicycle
car
motorbike
aeroplane
bus
train
truck
boat
traffic light
fire hydrant
stop sign
parking meter
bench
bird
cat
dog
horse
sheep
cow
elephant
bear
zebra
giraffe
backpack
umbrella
handbag
tie
suitcase
frisbee
skis
snowboard
sports ball
kite
baseball bat
baseball glove
skateboard
surfboard
tennis racket
bottle
wine glass
cup
fork
knife
spoon
bowl
banana
apple
sandwich
orange
broccoli
carrot
hot dog
pizza
donut
cake
chair
sofa
pottedplant
bed
diningtable
toilet
tvmonitor
laptop
mouse
remote
keyboard
cell phone
microwave
oven
toaster
sink
refrigerator
book
clock
vase
scissors
teddy bear
hair drier
toothbrush
@@ -16,17 +16,21 @@
  * =============================================================
  */
 """
+# -*- coding:utf-8 -*-
 from __future__ import absolute_import
 from __future__ import division
 from __future__ import print_function
+import sys
+import codecs
 import os
 import numpy as np
 from PIL import Image
 import oneflow as flow
 from of_model.resnet_model import resnet50
+sys.stdout = codecs.getwriter("utf-8")(sys.stdout.detach())
 def init_resnet():
     """Initialize ResNet with pretrained weights"""
@@ -60,7 +64,7 @@ def InferenceNet(images=flow.FixedTensorDef(
 def resnet_inf(image_path):
     """The whole procedure of inference of ResNet and return the category_id and the corresponding score"""
-    image = load_image(image_path)
+    image = load_image(image_path.encode('utf-8'))
     predictions = InferenceNet(image).get()
     clsidx = predictions.ndarray().argmax() + 161
     return predictions.ndarray().max(), clsidx
@@ -0,0 +1,277 @@
"""
/**
* Copyright 2020 Zhejiang Lab. All Rights Reserved.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
* =============================================================
*/
"""
import json
import time
import cv2
import numpy as np
import oneflow_yolov3
from yolo_net import YoloPredictNet
import oneflow as flow
'''Init oneflow config'''
model_load_dir = "of_model/yolov3_model_python/"
label_to_name_file = "coco.names"
use_tensorrt = 0
gpu_num_per_node = 1
batch_size = 16
image_height = 608
image_width = 608
flow.config.load_library(oneflow_yolov3.lib_path())
func_config = flow.FunctionConfig()
func_config.default_distribute_strategy(flow.distribute.consistent_strategy())
func_config.default_data_type(flow.float)
if use_tensorrt != 0:
    func_config.use_tensorrt(True)
label_2_name = []
with open(label_to_name_file, 'r') as f:
    label_2_name = f.readlines()
nms = True
print("nms:", nms)
input_blob_def_dict = {
    "images": flow.FixedTensorDef((batch_size, 3, image_height, image_width), dtype=flow.float),
    "origin_image_info": flow.FixedTensorDef((batch_size, 2), dtype=flow.int32),
}
def xywh_2_x1y1x2y2(x, y, w, h, origin_image):
    """Box format transform: normalized center (x, y, w, h) to pixel corners."""
    x1 = (x - w / 2.) * origin_image[1]
    x2 = (x + w / 2.) * origin_image[1]
    y1 = (y - h / 2.) * origin_image[0]
    y2 = (y + h / 2.) * origin_image[0]
    return x1, y1, x2, y2
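The network emits boxes as normalized center coordinates, which this conversion scales by the original image size. A standalone numeric check (a pure-Python mirror of the function, with `origin_hw` standing in for `origin_image_info[k]`):

```python
def xywh_to_corners(x, y, w, h, origin_hw):
    """Convert a normalized (cx, cy, w, h) box to pixel corner coordinates.

    origin_hw is (height, width), matching the origin_image_info layout.
    """
    origin_h, origin_w = origin_hw
    x1 = (x - w / 2.0) * origin_w
    x2 = (x + w / 2.0) * origin_w
    y1 = (y - h / 2.0) * origin_h
    y2 = (y + h / 2.0) * origin_h
    return x1, y1, x2, y2

# A centered box covering half of a 400x600 (h x w) image:
corners = xywh_to_corners(0.5, 0.5, 0.5, 0.5, (400, 600))
# corners == (150.0, 100.0, 450.0, 300.0)
```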
def batch_boxes(positions, probs, origin_image_info):
    """Post-process raw network output into per-image box lists."""
    batch_size = positions.shape[0]
    batch_list = []
    if nms:
        for k in range(batch_size):
            box_list = []
            for i in range(1, 81):
                for j in range(positions.shape[2]):
                    if positions[k][i][j][2] != 0 and positions[k][i][j][3] != 0 and probs[k][i][j] != 0:
                        x1, y1, x2, y2 = xywh_2_x1y1x2y2(positions[k][i][j][0], positions[k][i][j][1],
                                                         positions[k][i][j][2], positions[k][i][j][3],
                                                         origin_image_info[k])
                        bbox = [i - 1, x1, y1, x2, y2, probs[k][i][j]]
                        box_list.append(bbox)
            batch_list.append(np.asarray(box_list))
    else:
        for k in range(batch_size):
            box_list = []
            for j in range(positions.shape[1]):
                for i in range(1, 81):
                    if positions[k][j][2] != 0 and positions[k][j][3] != 0 and probs[k][j][i] != 0:
                        x1, y1, x2, y2 = xywh_2_x1y1x2y2(positions[k][j][0], positions[k][j][1], positions[k][j][2],
                                                         positions[k][j][3], origin_image_info[k])
                        bbox = [i - 1, x1, y1, x2, y2, probs[k][j][i]]
                        box_list.append(bbox)
            batch_list.append(np.asarray(box_list))
    return batch_list
@flow.function(func_config)
def yolo_user_op_eval_job(images=input_blob_def_dict["images"],
                          origin_image_info=input_blob_def_dict["origin_image_info"]):
    """The model inference"""
    yolo_pos_result, yolo_prob_result = YoloPredictNet(images, origin_image_info, trainable=False)
    yolo_pos_result = flow.identity(yolo_pos_result, name="yolo_pos_result_end")
    yolo_prob_result = flow.identity(yolo_prob_result, name="yolo_prob_result_end")
    return yolo_pos_result, yolo_prob_result, origin_image_info
def yolo_show(image_path_list, batch_list):
    """Debug helper: draw and save the Yolov3 detections."""
    font = cv2.FONT_HERSHEY_SIMPLEX
    for img_path, batch in zip(image_path_list, batch_list):
        result_list = batch.tolist()
        img = cv2.imread(img_path)
        for result in result_list:
            cls = int(result[0])
            bbox = result[1:-1]
            score = result[-1]
            print('img_file:', img_path)
            print('cls:', cls)
            print('bbox:', bbox)
            c = ((int(bbox[0]) + int(bbox[2])) / 2, (int(bbox[1]) + int(bbox[3])) / 2)
            cv2.rectangle(img, (int(bbox[0]), int(bbox[1])), (int(bbox[2]), int(bbox[3])), (0, 255, 255), 1)
            cv2.putText(img, str(cls), (int(c[0]), int(c[1])), font, 1, (0, 0, 255), 1)
        result_name = img_path.split('/')[-1]
        cv2.imwrite("data/results/" + result_name, img)
def resize_image(img, origin_h, origin_w, image_height, image_width):
    """Bilinear image resize used in preprocessing."""
    w = image_width
    h = image_height
    resized = np.zeros((3, image_height, image_width), dtype=np.float32)
    part = np.zeros((3, origin_h, image_width), dtype=np.float32)
    w_scale = float(origin_w - 1) / (w - 1)
    h_scale = float(origin_h - 1) / (h - 1)
    # Horizontal pass: interpolate columns into `part`.
    for c in range(w):
        if c == w - 1 or origin_w == 1:
            val = img[:, :, origin_w - 1]
        else:
            sx = c * w_scale
            ix = int(sx)
            dx = sx - ix
            val = (1 - dx) * img[:, :, ix] + dx * img[:, :, ix + 1]
        part[:, :, c] = val
    # Vertical pass: interpolate rows of `part` into `resized`.
    for r in range(h):
        sy = r * h_scale
        iy = int(sy)
        dy = sy - iy
        val = (1 - dy) * part[:, iy, :]
        resized[:, r, :] = val
        if r == h - 1 or origin_h == 1:
            continue
        resized[:, r, :] = resized[:, r, :] + dy * part[:, iy + 1, :]
    return resized
def batch_image_preprocess_v2(img_path_list, image_height, image_width):
    """Letterbox-preprocess a batch of images for the network input."""
    result_list = []
    origin_info_list = []
    for img_path in img_path_list:
        img = cv2.imread(img_path, cv2.IMREAD_COLOR)
        img = img.transpose(2, 0, 1).astype(np.float32)  # hwc->chw
        img = img / 255  # normalize to [0, 1]
        img[[0, 1, 2], :, :] = img[[2, 1, 0], :, :]  # bgr2rgb
        w = image_width
        h = image_height
        origin_h = img.shape[1]
        origin_w = img.shape[2]
        new_w = origin_w
        new_h = origin_h
        # Scale to fit the network input while preserving aspect ratio.
        if w / origin_w < h / origin_h:
            new_w = w
            new_h = origin_h * w // origin_w
        else:
            new_h = h
            new_w = origin_w * h // origin_h
        resize_img = resize_image(img, origin_h, origin_w, new_h, new_w)
        dw = (w - new_w) // 2
        dh = (h - new_h) // 2
        padh_before = int(dh)
        padh_after = int(h - new_h - padh_before)
        padw_before = int(dw)
        padw_after = int(w - new_w - padw_before)
        result = np.pad(resize_img, pad_width=((0, 0), (padh_before, padh_after), (padw_before, padw_after)),
                        mode='constant', constant_values=0.5)
        origin_image_info = [origin_h, origin_w]
        result_list.append(result)
        origin_info_list.append(origin_image_info)
    results = np.asarray(result_list).astype(np.float32)
    origin_image_infos = np.asarray(origin_info_list).astype(np.int32)
    return results, origin_image_infos
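The branch that picks `new_w`/`new_h` keeps the aspect ratio so the image fits the 608x608 network input, and the remainder is padded symmetrically. The size arithmetic in isolation (`letterbox_dims` is a hypothetical name for just that slice of the function above):

```python
def letterbox_dims(origin_h, origin_w, h=608, w=608):
    """Aspect-preserving target size plus symmetric padding offsets."""
    if w / origin_w < h / origin_h:
        new_w, new_h = w, origin_h * w // origin_w
    else:
        new_h, new_w = h, origin_w * h // origin_h
    dw = (w - new_w) // 2
    dh = (h - new_h) // 2
    return new_h, new_w, dh, dw

# A 480x640 (h x w) frame scales to 456x608, leaving 152 rows of padding
# split as 76 above and 76 below:
dims = letterbox_dims(480, 640)
# dims == (456, 608, 76, 0)
```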
def coco_format(type_, id_list, file_list, result_list, label_list, coco_flag=0):
    """Transform the annotations to COCO format."""
    annotations = []
    for i, result in enumerate(result_list):
        temp = {}
        id_name = id_list[i]
        temp['id'] = id_name
        temp['annotation'] = []
        if result.shape[0] == 0:
            temp['annotation'] = json.dumps(temp['annotation'])
            annotations.append(temp)
            continue
        else:
            for j in range(result.shape[0]):
                cls_id = int(result[j][0]) + 1 + coco_flag
                x1 = result[j][1]
                x2 = result[j][3]
                y1 = result[j][2]
                y2 = result[j][4]
                score = result[j][5]
                bbox_w = max(0, x2 - x1)
                bbox_h = max(0, y2 - y1)
                if cls_id in label_list:
                    temp['annotation'].append({
                        'area': bbox_w * bbox_h,
                        'bbox': [x1, y1, bbox_w, bbox_h],
                        'category_id': cls_id,
                        'iscrowd': 0,
                        'segmentation': [[x1, y1, x2, y1, x2, y2, x1, y2]],
                        'score': score
                    })
            if type_ == 2 and len(temp['annotation']) > 0:
                # Classification-style tasks keep only the first record,
                # stripped of detection-only fields.
                temp['annotation'] = [temp['annotation'][0]]
                temp['annotation'][0].pop('area')
                temp['annotation'][0].pop('bbox')
                temp['annotation'][0].pop('iscrowd')
                temp['annotation'][0].pop('segmentation')
            temp['annotation'] = json.dumps(temp['annotation'])
            annotations.append(temp)
    return annotations
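Each detection becomes one COCO-style record. Given pixel corner coordinates, the derived fields look like this (`coco_record` is a minimal standalone mirror of the dict built above):

```python
def coco_record(cls_id, x1, y1, x2, y2, score):
    """Build one per-box annotation dict from corner coordinates."""
    w = max(0, x2 - x1)
    h = max(0, y2 - y1)
    return {
        'area': w * h,
        'bbox': [x1, y1, w, h],  # COCO bbox is [x, y, width, height]
        'category_id': cls_id,
        'iscrowd': 0,
        # The axis-aligned box expressed as a 4-point polygon.
        'segmentation': [[x1, y1, x2, y1, x2, y2, x1, y2]],
        'score': score,
    }

rec = coco_record(3, 10, 20, 110, 70, 0.9)
# rec['bbox'] == [10, 20, 100, 50] and rec['area'] == 5000
```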
class YoloInference(object):
    """Yolov3 detection inference"""
    def __init__(self, label_log):
        self.label_log = label_log
        flow.config.gpu_device_num(gpu_num_per_node)
        flow.env.ctrl_port(9789)
        check_point = flow.train.CheckPoint()
        if not model_load_dir:
            check_point.init()
        else:
            check_point.load(model_load_dir)
        print("Load check_point success")
        self.label_log.info("Load check_point success")
    def yolo_inference(self, type_, id_list, image_path_list, label_list, coco_flag=0):
        annotations = []
        try:
            if len(image_path_list) == 16:
                t0 = time.time()
                images, origin_image_info = batch_image_preprocess_v2(image_path_list, image_height, image_width)
                yolo_pos, yolo_prob, origin_image_info = yolo_user_op_eval_job(images, origin_image_info).get()
                batch_list = batch_boxes(yolo_pos, yolo_prob, origin_image_info)
                annotations = coco_format(type_, id_list, image_path_list, batch_list, label_list, coco_flag)
                t1 = time.time()
                print('t1-t0:', t1 - t0)
        except Exception as e:
            print("Forward Error:", e)
            self.label_log.error("Forward Error: %s", e)
            # Fall back to empty annotations so the caller still gets a
            # result for every input image.
            for i, image_path in enumerate(image_path_list):
                temp = {}
                id_name = id_list[i]
                temp['id'] = id_name
                temp['annotation'] = json.dumps([])
                annotations.append(temp)
        return annotations
@@ -0,0 +1,80 @@
person
bicycle
car
motorbike
aeroplane
bus
train
truck
boat
traffic light
fire hydrant
stop sign
parking meter
bench
bird
cat
dog
horse
sheep
cow
elephant
bear
zebra
giraffe
backpack
umbrella
handbag
tie
suitcase
frisbee
skis
snowboard
sports ball
kite
baseball bat
baseball glove
skateboard
surfboard
tennis racket
bottle
wine glass
cup
fork
knife
spoon
bowl
banana
apple
sandwich
orange
broccoli
carrot
hot dog
pizza
donut
cake
chair
sofa
pottedplant
bed
diningtable
toilet
tvmonitor
laptop
mouse
remote
keyboard
cell phone
microwave
oven
toaster
sink
refrigerator
book
clock
vase
scissors
teddy bear
hair drier
toothbrush
@@ -0,0 +1,93 @@
"""
/**
* Copyright 2020 Zhejiang Lab. All Rights Reserved.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
* =============================================================
*/
"""
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
import sched
import sys
sys.path.append(r"./common")
import logging
import time
import json
import common.of_cnn_resnet as of_cnn_resnet
import numpy as np
import luascript.delaytaskscript as delay_script
import common.config as config
from datetime import datetime
schedule = sched.scheduler(time.time, time.sleep)
base_path = "/nfs/"
delayId = ""
def _init():
    of_cnn_resnet.init_resnet()
    logging.info('env init finished')
def process(task_dict, key):
    """Imagenet task method.
    Args:
        task_dict: imagenet task details.
        key: imagenet task key.
    """
    global delayId
    # The queue element is a JSON-quoted string; json.loads is a safer
    # equivalent of eval() here.
    delayId = "\"" + json.loads(key.decode("utf-8")) + "\""
    task_dict = json.loads(task_dict)
    id_list = []
    image_path_list = []
    for file in task_dict["files"]:
        id_list.append(file["id"])
        image_path_list.append(base_path + file["url"])
    label_list = task_dict["labels"]
    image_num = len(image_path_list)
    annotations = []
    for inds in range(len(image_path_list)):
        temp = {}
        temp['id'] = id_list[inds]
        score, ca_id = of_cnn_resnet.resnet_inf(image_path_list[inds])
        temp['annotation'] = [{'category_id': int(ca_id), 'score': float(score)}]
        temp['annotation'] = json.dumps(temp['annotation'])
        annotations.append(temp)
    result = {"annotations": annotations, "task": key.decode()}
    return result
def delaySchduled(inc, redisClient):
    """Delay task method.
    Args:
        inc: scheduled task time.
        redisClient: redis client.
    """
    try:
        print("delay:" + datetime.now().strftime("%Y-%m-%d %H:%M:%S"))
        redisClient.eval(delay_script.delayTaskLua, 1, config.imagenetStartQueue, delayId, int(time.time()))
        schedule.enter(inc, 0, delaySchduled, (inc, redisClient))
    except Exception as e:
        print("delay error:", e)
def delayKeyThread(redisClient):
    """Delay task thread.
    Args:
        redisClient: redis client.
    """
    schedule.enter(0, 0, delaySchduled, (5, redisClient))
    schedule.run()
@@ -0,0 +1,189 @@
"""
/**
* Copyright 2020 Zhejiang Lab. All Rights Reserved.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
* =============================================================
*/
"""
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
from datetime import datetime
import sched
import os
import cv2
import numpy as np
import logging
import time
import json
import argparse
import sys
import codecs
import shutil
import luascript.delaytaskscript as delay_script
import common.config as config
from common.augment_utils.ACE import ACE_color
from common.augment_utils.dehaze import deHaze, addHaze
from common.augment_utils.hist_equalize import adaptive_hist_equalize
from common.log_config import setup_log
schedule = sched.scheduler(time.time, time.sleep)
delayId = ""
finish_key = {}
re_task_id = {}
sys.stdout = codecs.getwriter("utf-8")(sys.stdout.detach())
# task url suffix
img_pro_url = 'api/data/datasets/'
# arguments
parser = argparse.ArgumentParser(description="config for image augmentation server")
parser.add_argument("-m", "--mode", type=str, default="test", required=False)
args = parser.parse_args()
# url concat (ip + port + suffix)
url_json = './common/config/url.json'
with open(url_json) as f:
    url_dict = json.loads(f.read())
img_pro_url = url_dict[args.mode] + img_pro_url
# create task queue
base_path = "/nfs/"
# create log path and file
des_folder = os.path.join('./log', args.mode)
if not os.path.exists(des_folder):
    os.makedirs(des_folder)
logging = setup_log(args.mode, 'enhance-' + args.mode + '.log')
enhanceTaskId = ""
def start_enhance_task(enhanceTaskId, redisClient):
    """Enhance task method.
    Args:
        enhanceTaskId: enhance task id.
        redisClient: redis client.
    """
    global delayId
    # Queue elements are JSON-quoted strings; json.loads is a safer
    # equivalent of eval() here.
    task_id = json.loads(enhanceTaskId[0].decode("utf-8"))
    detailKey = 'imgProcess:' + task_id
    delayId = "\"" + task_id + "\""
    print(detailKey)
    taskParameters = json.loads(redisClient.get(detailKey).decode())
    dataset_id = taskParameters['id']
    img_save_path = taskParameters['enhanceFilePath']
    ann_save_path = taskParameters["enhanceAnnotationPath"]
    file_list = taskParameters['fileDtos']
    nums_, img_path_list, ann_path_list = img_ann_list_gen(file_list)
    process_type = taskParameters['type']
    re_task_id = task_id
    img_process_config = [dataset_id, img_save_path,
                          ann_save_path, img_path_list,
                          ann_path_list, process_type, re_task_id]
    image_enhance_process(img_process_config, redisClient)
    logging.info(str(nums_) + ' images for augment')
def img_ann_list_gen(file_list):
    """Analyze the json request and convert to lists of image and annotation paths"""
    nums_ = len(file_list)
    img_list = []
    ann_list = []
    for i in range(nums_):
        img_list.append(file_list[i]['filePath'])
        ann_list.append(file_list[i]['annotationPath'])
    return nums_, img_list, ann_list
def image_enhance_process(img_task, redisClient):
    """The implementation of the image augmentation thread"""
    global img_pro_url
    global finish_key
    global re_task_id
    logging.info('img_process server start'.center(66, '-'))
    logging.info(img_pro_url)
    try:
        dataset_id = img_task[0]
        img_save_path = img_task[1]
        ann_save_path = img_task[2]
        img_list = img_task[3]
        ann_list = img_task[4]
        method = img_task[5]
        re_task_id = img_task[6]
        suffix = '_enhanced_' + re_task_id
        logging.info("dataset_id " + str(dataset_id))
        finish_key = {"processKey": re_task_id}
        finish_data = {"id": re_task_id,
                       "suffix": suffix}
        for j in range(len(ann_list)):
            img_path = img_list[j]
            ann_path = ann_list[j]
            img_process(suffix, img_path, ann_path,
                        img_save_path, ann_save_path, method)
        redisClient.lpush(config.imgProcessFinishQueue, json.dumps(finish_key, separators=(',', ':')))
        redisClient.set("imgProcess:finished:" + re_task_id, json.dumps(finish_data))
        redisClient.zrem(config.imgProcessStartQueue, "\"" + re_task_id + "\"")
        logging.info('suffix:' + suffix)
        logging.info("End img_process of dataset:" + str(dataset_id))
    except Exception as e:
        redisClient.lpush(config.imgProcessFailedQueue, json.dumps(finish_key, separators=(',', ':')))
        redisClient.zrem(config.imgProcessStartQueue, "\"" + re_task_id + "\"")
        logging.info(img_pro_url)
        logging.error("Error imgProcess")
        logging.error(e)
    time.sleep(0.01)
def img_process(suffix, img_path, ann_path, img_save_path, ann_save_path, method_ind):
    """Apply the selected augmentation to one image and save it to the specified path"""
    inds2method = {1: deHaze, 2: addHaze, 3: ACE_color, 4: adaptive_hist_equalize}
    method = inds2method[method_ind]
    # np.fromfile + imdecode reads paths that contain non-ASCII characters.
    img_raw = cv2.imdecode(np.fromfile(img_path.encode('utf-8'), dtype=np.uint8), 1)
    img_suffix = os.path.splitext(img_path)[-1]
    ann_name = ann_path.replace(ann_save_path, '')
    if method_ind <= 3:
        # These methods expect input scaled to [0, 1].
        processed_img = method(img_raw / 255.0) * 255
    else:
        processed_img = method(img_raw)
    cv2.imwrite(img_save_path + ann_name + suffix + img_suffix,
                processed_img.astype(np.uint8))
    shutil.copyfile(ann_path.encode('utf-8'), (ann_path + suffix).encode('utf-8'))
def delaySchduled(inc, redisClient):
    """Delay task method.
    Args:
        inc: scheduled task time.
        redisClient: redis client.
    """
    try:
        logging.info("delay:" + datetime.now().strftime("%Y-%m-%d %H:%M:%S") + ":" + delayId)
        redisClient.eval(delay_script.delayTaskLua, 1, config.imgProcessStartQueue, delayId, int(time.time()))
        schedule.enter(inc, 0, delaySchduled, (inc, redisClient))
    except Exception as e:
        print("delay error:", e)
def delayKeyThread(redisClient):
    """Delay task thread.
    Args:
        redisClient: redis client.
    """
    schedule.enter(0, 0, delaySchduled, (5, redisClient))
    schedule.run()
@@ -0,0 +1,26 @@
"""
/**
* Copyright 2020 Zhejiang Lab. All Rights Reserved.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
* =============================================================
*/
"""
# coding:utf-8
delayTaskLua = """
local element = redis.call('zscore', KEYS[1], ARGV[1])
if element then
    redis.call('zadd', KEYS[1], ARGV[2], ARGV[1])
end
"""
@@ -0,0 +1,27 @@
"""
/**
* Copyright 2020 Zhejiang Lab. All Rights Reserved.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
* =============================================================
*/
"""
# coding:utf-8
failedTaskLua = """
redis.call('zrem', KEYS[1], ARGV[1])
redis.call('lpush', KEYS[2], ARGV[2])
redis.call('del', KEYS[3])
redis.call('del', KEYS[4])
return
"""
@@ -0,0 +1,27 @@
"""
/**
* Copyright 2020 Zhejiang Lab. All Rights Reserved.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
* =============================================================
*/
"""
# coding:utf-8
finishTaskLua = """
local queues, values = KEYS, ARGV
redis.call('zrem', queues[1], values[1])
redis.call('lpush', queues[2], values[2])
redis.call('del', queues[3])
return
"""
@@ -0,0 +1,30 @@
"""
/**
* Copyright 2020 Zhejiang Lab. All Rights Reserved.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
* =============================================================
*/
"""
# coding:utf-8
getTaskLua = """
local queue = KEYS[1]
local element = redis.call('zrangebyscore', queue, 0, 9999999999999, 'limit', 0, 1)
if table.getn(element) > 0 then
    redis.call('zrem', queue, element[1])
end
return element
"""
@@ -0,0 +1,29 @@
"""
/**
* Copyright 2020 Zhejiang Lab. All Rights Reserved.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
* =============================================================
*/
"""
# coding:utf-8
startTaskLua = """
local queue, value, time = KEYS[1], ARGV[1], ARGV[2]
local element = redis.call('zrangebyscore', queue, 0, 9999999999999, 'limit', 0, 1)
if table.getn(element) > 0 then
    redis.call('zrem', queue, element[1])
    redis.call('zadd', value, time, element[1])
end
return element
"""