Cover Letter
Dear Editors and Reviewers,
With this letter, we would like to submit the manuscript entitled “Soft Masked Transformer for Point Cloud Processing with Skip Attention-Based Upsampling” for consideration by the IEEE Transactions on Circuits and Systems for Video Technology.
We declare that there are no conflicts of interest associated with this submission and that all authors have approved the manuscript for publication. The work presented is original research that has not been previously published, nor is it under consideration for publication elsewhere.
In this submission, our main contributions are threefold:
1) We propose a Soft Masked Transformer block, which integrates task-level information into the attention mechanism, enhancing its effectiveness for downstream tasks such as semantic segmentation and classification.
2) We introduce a Skip Attention-based Up-sampling block, which dynamically combines features from points at different resolutions across the encoding layers, improving the model’s ability to capture contextual information.
3) We present a shared position encoding strategy, which reduces network parameters by 24.3% and training time by 33.3%. This strategy enhances the efficiency of the network without sacrificing performance.
Thank you for the opportunity to submit our manuscript for review. We look forward to receiving your valuable comments and hope the manuscript will be deemed suitable for publication in the IEEE Transactions on Circuits and Systems for Video Technology. Should you have any questions or concerns regarding this manuscript, please do not hesitate to contact us.
Yours sincerely,
Authors,
E-mail: [email protected]