
Added tests for uploaders/pulp.py #88

Merged: 1 commit merged into AlmaLinux:master on May 6, 2024

Conversation

isudak (Contributor) commented on May 6, 2024

No description provided.

@isudak self-assigned this on May 6, 2024

github-actions bot commented May 6, 2024

11 passed

Code Coverage Summary

Package              Line Rate
sign_node                  32%
sign_node.uploaders        82%
sign_node.utils            46%
Summary                    44% (373 / 849)

Linter reports

Pylint report
************* Module sign_node.uploaders.pulp
sign_node/uploaders/pulp.py:34:0: C0301: Line too long (81/80) (line-too-long)
sign_node/uploaders/pulp.py:58:0: C0301: Line too long (82/80) (line-too-long)
sign_node/uploaders/pulp.py:95:0: C0301: Line too long (88/80) (line-too-long)
sign_node/uploaders/pulp.py:135:0: C0301: Line too long (82/80) (line-too-long)
sign_node/uploaders/pulp.py:144:0: C0301: Line too long (87/80) (line-too-long)
sign_node/uploaders/pulp.py:148:0: C0301: Line too long (81/80) (line-too-long)
sign_node/uploaders/pulp.py:198:0: C0301: Line too long (90/80) (line-too-long)
sign_node/uploaders/pulp.py:200:9: W0511: TODO: Decide what to do with successfully uploaded artifacts (fixme)
sign_node/uploaders/pulp.py:146:39: R1732: Consider using 'with' for resource-allocating operations (consider-using-with)
sign_node/uploaders/pulp.py:173:4: R1710: Either all return statements in a function should return an expression, or none of them should. (inconsistent-return-statements)
sign_node/uploaders/pulp.py:197:19: W0718: Catching too general exception Exception (broad-exception-caught)

-----------------------------------
Your code has been rated at 9.05/10
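
Beyond the line-length warnings, the three remaining Pylint findings (R1732, R1710, W0718) each point at a recurring pattern rather than a one-off typo. A minimal, hypothetical sketch of how each could be resolved; the function names below are illustrative only and do not come from `sign_node/uploaders/pulp.py`:

```python
import csv
import os


def read_manifest(manifest_path):
    # R1732 (consider-using-with): open the manifest inside a `with`
    # block so the handle is always closed, instead of passing a bare
    # open() result to csv.DictReader as the flagged code does.
    with open(manifest_path, "r") as fd:
        return list(csv.DictReader(fd))


def find_artifact(artifacts, name):
    # R1710 (inconsistent-return-statements): every exit path returns
    # an expression -- return None explicitly rather than falling off
    # the end of the function on the not-found path.
    for artifact in artifacts:
        if artifact == name:
            return artifact
    return None


def safe_remove(path):
    # W0718 (broad-exception-caught): catch the specific exception the
    # operation can raise (OSError here) instead of a bare Exception.
    try:
        os.remove(path)
        return True
    except OSError:
        return False
```

Whether W0718 is worth fixing in `upload_artifacts_dir` is debatable: the broad catch there is deliberate, so an inline `# pylint: disable=broad-exception-caught` with a justifying comment may be the better resolution than narrowing the clause.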


Black report
--- tests/sign_node/uploaders/test_pulp.py	2024-05-06 08:06:41.850516+00:00
+++ tests/sign_node/uploaders/test_pulp.py	2024-05-06 08:07:55.352892+00:00
@@ -19,11 +19,11 @@
         self.file_map = {}
         self.file_lst = []
         self.file_paths = sorted([
             '/build_dir/package.rpm',
             '/build_dir/build.log',
-            '/build_dir/config.cfg'
+            '/build_dir/config.cfg',
         ])
 
         for file_path in self.file_paths:
             self.fs.create_file(file_path, contents=file_path)
             hsh = hash_file(file_path, hash_type="sha256")
@@ -91,10 +91,11 @@
                 response.task = TasksApi.TASK_HREF
                 return response
 
         class TasksApi:
             TASK_HREF = 'task1'
+
             def __init__(self, *_, **__):
                 pass
 
             def read(self, task_href):
                 assert task_href == self.TASK_HREF
@@ -104,11 +105,13 @@
                 return result
 
         with (
             patch('sign_node.uploaders.pulp.UploadsApi', new=UploadsApi),
             patch('sign_node.uploaders.pulp.TasksApi', new=TasksApi),
-            patch.object(PulpRpmUploader, 'check_if_artifact_exists', return_value=None)
+            patch.object(
+                PulpRpmUploader, 'check_if_artifact_exists', return_value=None
+            ),
         ):
             uploader = PulpRpmUploader('localhost', 'user', 'password', f_size)
             file_sha256, artifact_href = uploader._send_file(f_path)
             assert file_sha256 == f_hash
             assert artifact_href == f_href
--- sign_node/uploaders/pulp.py	2024-05-06 08:06:41.846516+00:00
+++ sign_node/uploaders/pulp.py	2024-05-06 08:07:55.378222+00:00
@@ -29,11 +29,13 @@
 class PulpBaseUploader(BaseUploader):
     """
     Handles uploads to Pulp server.
     """
 
-    def __init__(self, host: str, username: str, password: str, chunk_size: int):
+    def __init__(
+        self, host: str, username: str, password: str, chunk_size: int
+    ):
         """
         Initiate uploader.
 
         Parameters
         ----------
@@ -53,11 +55,13 @@
         self._file_splitter = Filesplit()
         self._chunk_size = chunk_size
         self._logger = logging.getLogger(__file__)
 
     @staticmethod
-    def _prepare_api_client(host: str, username: str, password: str) -> ApiClient:
+    def _prepare_api_client(
+        host: str, username: str, password: str
+    ) -> ApiClient:
         """
 
         Parameters
         ----------
         host : str
@@ -90,11 +94,13 @@
         result = self._tasks_client.read(task_href)
         while result.state not in ("failed", "completed"):
             time.sleep(5)
             result = self._tasks_client.read(task_href)
         if result.state == "failed":
-            raise TaskFailedError(f"task {task_href} has failed, " f"details: {result}")
+            raise TaskFailedError(
+                f"task {task_href} has failed, " f"details: {result}"
+            )
         return result
 
     def _create_upload(self, file_path: str) -> (str, int):
         """
 
@@ -130,28 +136,36 @@
         str
             Reference to the created resource.
 
         """
         file_sha256 = hash_file(file_path, hash_type="sha256")
-        response = self._uploads_client.commit(reference, {"sha256": file_sha256})
+        response = self._uploads_client.commit(
+            reference, {"sha256": file_sha256}
+        )
         task_result = self._wait_for_task_completion(response.task)
         return task_result.created_resources[0]
 
     def _put_large_file(self, file_path: str, reference: str):
         temp_dir = tempfile.mkdtemp(prefix="pulp_uploader_")
         try:
             lower_bytes_limit = 0
             total_size = os.path.getsize(file_path)
-            self._file_splitter.split(file_path, self._chunk_size, output_dir=temp_dir)
+            self._file_splitter.split(
+                file_path, self._chunk_size, output_dir=temp_dir
+            )
             manifest_path = os.path.join(temp_dir, 'fs_manifest.csv')
             for meta in csv.DictReader(open(manifest_path, 'r')):
                 split_file_path = os.path.join(temp_dir, meta['filename'])
-                upper_bytes_limit = lower_bytes_limit + int(meta['filesize']) - 1
+                upper_bytes_limit = (
+                    lower_bytes_limit + int(meta['filesize']) - 1
+                )
                 self._uploads_client.update(
                     f'bytes {lower_bytes_limit}-{upper_bytes_limit}/'
                     f'{total_size}',
-                    reference, split_file_path)
+                    reference,
+                    split_file_path,
+                )
                 lower_bytes_limit += int(meta['filesize'])
         finally:
             if temp_dir and os.path.exists(temp_dir):
                 shutil.rmtree(temp_dir)
 
@@ -193,21 +207,24 @@
         errored_uploads = []
         for artifact in self.get_artifacts_list(artifacts_dir):
             try:
                 artifacts.append(self.upload_single_file(artifact))
             except Exception as e:
-                self._logger.error("Cannot upload %s, error: %s", artifact, e, exc_info=e)
+                self._logger.error(
+                    "Cannot upload %s, error: %s", artifact, e, exc_info=e
+                )
                 errored_uploads.append(artifact)
         # TODO: Decide what to do with successfully uploaded artifacts
         #  in case of errors during upload.
         if errored_uploads:
             raise UploadError(f"Unable to upload files: {errored_uploads}")
         return artifacts
 
     def upload_single_file(
-            self, filename: str,
-            artifact_type: typing.Optional[str] = None,
+        self,
+        filename: str,
+        artifact_type: typing.Optional[str] = None,
     ) -> Artifact:
         """
 
         Parameters
         ----------

Isort report
--- /sign-node/sign_node/models.py:before	2024-05-06 08:06:41.846516
+++ /sign-node/sign_node/models.py:after	2024-05-06 08:07:55.921289
@@ -1,5 +1,4 @@
 from pydantic import BaseModel
-
 
 __all__ = ["Task", "Artifact"]
 
--- /sign-node/sign_node/uploaders/pulp.py:before	2024-05-06 08:06:41.846516
+++ /sign-node/sign_node/uploaders/pulp.py:after	2024-05-06 08:07:55.928288
@@ -1,23 +1,22 @@
 import csv
 import logging
 import os
+import shutil
 import tempfile
 import time
-import shutil
 import typing
 from typing import List
 
 from fsplit.filesplit import Filesplit
-from pulpcore.client.pulpcore.configuration import Configuration
-from pulpcore.client.pulpcore.api_client import ApiClient
+from pulpcore.client.pulpcore.api.artifacts_api import ArtifactsApi
 from pulpcore.client.pulpcore.api.tasks_api import TasksApi
 from pulpcore.client.pulpcore.api.uploads_api import UploadsApi
-from pulpcore.client.pulpcore.api.artifacts_api import ArtifactsApi
-
+from pulpcore.client.pulpcore.api_client import ApiClient
+from pulpcore.client.pulpcore.configuration import Configuration
+
+from sign_node.models import Artifact
 from sign_node.uploaders.base import BaseUploader, UploadError
 from sign_node.utils.file_utils import hash_file
-from sign_node.models import Artifact
-
 
 __all__ = ["PulpBaseUploader", "PulpRpmUploader"]
 

Bandit report
Run started:2024-05-06 08:07:56.546201

Test results:
	No issues identified.

Code scanned:
	Total lines of code: 226
	Total lines skipped (#nosec): 0
	Total potential issues skipped due to specifically being disabled (e.g., #nosec BXXX): 0

Run metrics:
	Total issues (by severity):
		Undefined: 0
		Low: 0
		Medium: 0
		High: 0
	Total issues (by confidence):
		Undefined: 0
		Low: 0
		Medium: 0
		High: 0
Files skipped (0):

View full reports on the Job Summary page.

@Korulag merged commit 8d5cede into AlmaLinux:master on May 6, 2024
2 of 3 checks passed
3 participants