I followed softlayer-object-storage-python in order to return a list of my objects matching specific criteria.
This code seems to just return everything in my container, no matter what I put into the search:
sl_storage = object_storage.get_client(
    username = environment['slos_username'],
    password = environment['api_key'],
    auth_url = environment['auth_url']
)
# get container
sl_container = sl_storage[environment['object_container']]
# get list, the search function doesn't actually work...
containers = sl_container.search("icm10restapi-qa.zip.*")
I expect only to get back things that start with icm10restapi-qa.zip.
I also tried using ^=icm10restapi-qa.zip but no luck either.
Reviewing the method, it seems that it is not possible to filter the objects as you would like:
https://github.com/softlayer/softlayer-object-storage-python/blob/master/object_storage/client.py#L147
API Operations for Search Services
My apologies for the inconvenience; I recommend filtering these in your code.
Updated
This script will help you filter your objects whose names start with a specific string:
import object_storage
import pprint
# Declare username, apikey and datacenter
USERNAME = 'set me'
API_KEY = 'set me'
DATACENTER = 'https://dal05.objectstorage.softlayer.net/auth/v1.0/'
# Creating object storage connection
sl_storage = object_storage.get_httplib2_client(USERNAME, API_KEY, auth_url=DATACENTER)
# Declare name to filter
name = 'icm10restapi-qa.zip'
# Filtering
containers = sl_storage.search(name)
for container in containers['results']:
    if container.__dict__['name'].startswith(name):
        print(container)
I need to get the full path of the folders where a file is located in Google Drive. I'm getting the files themselves using the Google Drive API, but I need information about their parent folders.
I'm using the following code to get the list of spreadsheets in a Shared Drive:
from googleapiclient import discovery
from httplib2 import Http
from oauth2client import file, client, tools
# Change the value of SCOPES to 'https://www.googleapis.com/auth/drive'
# if you want to be able to read and write to the user's Google Drive.
SCOPES = 'https://www.googleapis.com/auth/drive'
store = file.Storage('storage.json')
creds = store.get()
if not creds or creds.invalid:
    flow = client.flow_from_clientsecrets('client_secret.json', SCOPES)
    creds = tools.run_flow(flow, store)
DRIVE = discovery.build('drive', 'v3', http=creds.authorize(Http()))
folder_id = "1Z1GzY-D3I3qwQu3oxIW-L1a9nXgD0PXl"
query = "mimeType='application/vnd.google-apps.spreadsheet'"
query+= "and fullText contains 'CLAS' and trashed = false"
# query += " and parents in '" + folder_id + "'"
spreadsheets = []
# Initialize the page token
next_page_token = None
# Loop until all pages of results have been retrieved
while True:
    # Execute the list request
    response = DRIVE.files().list(
        q=query,
        corpora='drive',
        includeItemsFromAllDrives=True,
        driveId='0AEJNMySKcEzsUk9PVA',
        supportsAllDrives=True,
        # orderBy='folder',
        pageSize=1000,
        fields='nextPageToken, files(id, name, parents, mimeType, webViewLink)',
        pageToken=next_page_token,
    ).execute()
    # Append the results to the list
    spreadsheets.extend(response.get('files', []))
    # Check if there is another page of results
    next_page_token = response.get('nextPageToken', None)
    if next_page_token is None:
        break
    # Set the page token for the next iteration
    # parameters['pageToken'] = next_page_token
# Print the number of results
print(f'Last spreadsheet found: {spreadsheets[-1]["name"]}. Number of spreadsheets: {len(spreadsheets)}')
This returns a list of dictionaries with the specified fields. I would like to know the names of the parent folders for each file, for which I'm trying:
from googleapiclient.errors import HttpError
for item in spreadsheets:
    if 'parents' in item:
        parent_folders_list = []
        parent_id = item['parents'][0]
        try:
            while parent_id:
                folder = DRIVE.files().get(fileId=parent_id, fields='name, id, parents').execute()
                parent_folders_list.append(folder.get("parents", []))
                if parent_id:
                    parent_id = parent_id[0]
        except HttpError as error:
            print('An error occurred: %s' % error)
        print(f'{item["name"]} is in {parent_folders_list}')
I've verified that parent_id is retrieved correctly and that I can access it, as I was able to open it in the browser. However, I get 'File Not Found' errors for every parent_id. I wonder whether DRIVE.files().get(fileId=...) is the correct way to retrieve a folder using the API.
Any help would be greatly appreciated.
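For reference, a minimal sketch of one way the traversal could work (an assumption on my part, not a verified fix): files().get() is the right call for fetching a folder by ID, but for items that live in a shared drive it also needs supportsAllDrives=True, and the next parent ID should be taken from the folder just fetched rather than from the previous ID:
from googleapiclient.errors import HttpError
for item in spreadsheets:
    parent_names = []
    parent_id = item.get('parents', [None])[0]
    try:
        while parent_id:
            # supportsAllDrives=True is required for items stored in a shared drive
            folder = DRIVE.files().get(
                fileId=parent_id,
                fields='id, name, parents',
                supportsAllDrives=True,
            ).execute()
            parent_names.append(folder['name'])
            # Continue with the fetched folder's own parent, if any
            parent_id = folder.get('parents', [None])[0]
    except HttpError as error:
        print('An error occurred: %s' % error)
    print(f'{item["name"]} is in {"/".join(reversed(parent_names))}')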
I've got some code which uses StringSession to talk to the Telegram API using telethon.
In my unit tests, I'm trying to instantiate a mocked TelegramClient, passing it a StringSession(myvalue) object as the first parameter. The real code works fine, but I need a fake session string for 'myvalue', to use in my unit tests (where I have a mocked telegram client).
How can I create a dummy value for 'myvalue' which will successfully execute StringSession(myvalue)?
Currently, my tests are dying here:
self = <telethon.sessions.string.StringSession object at 0x7f0777492ad0>
string = 'dummyxxx'
    def __init__(self, string: str = None):
        super().__init__()
        if string:
            if string[0] != CURRENT_VERSION:
                raise ValueError('Not a valid string')
            string = string[1:]
            ip_len = 4 if len(string) == 352 else 16
>           self._dc_id, ip, self._port, key = struct.unpack(
                _STRUCT_PREFORMAT.format(ip_len), StringSession.decode(string))
E           struct.error: unpack requires a buffer of 275 bytes
If you don't need a valid session to start with, you can also use MemorySession instead:
from telethon.sessions import MemorySession
session = MemorySession()
# use session variable when creating the client
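For example, a minimal sketch of passing it to the client (the api_id and api_hash values here are made-up placeholders):
from telethon import TelegramClient
from telethon.sessions import MemorySession

# Dummy credentials purely for illustration; in a unit test the client itself
# would typically be mocked anyway.
client = TelegramClient(MemorySession(), api_id=12345, api_hash='0123456789abcdef0123456789abcdef')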
Someone posted an answer which helped point me in the right direction, but they later deleted it for some reason.
In case it helps anyone else, here is the code that worked for me:
import struct
import base64
from telethon.sessions import StringSession
_STRUCT_PREFORMAT = '>B{}sH256s'
CURRENT_VERSION = '1'
dc_id = 1
ip = b'\x7f\x00\x00\x01' # 127.0.0.1
port = 80
key = b'\x00' * 256
string = StringSession.encode(struct.pack(
    _STRUCT_PREFORMAT.format(len(ip)),
    dc_id,
    ip,
    port,
    key
))
myvalue = CURRENT_VERSION + string
# Create the StringSession object using the dummy value to confirm it works
session = StringSession(myvalue)
print(myvalue)
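As a sanity check in a test (a hypothetical sketch using the myvalue built above; the api_id and api_hash values are placeholders), the dummy string can be handed to a mocked client just like a real one:
from unittest.mock import MagicMock
from telethon.sessions import StringSession

mock_client_cls = MagicMock(name='TelegramClient')
mock_client = mock_client_cls(StringSession(myvalue), api_id=12345, api_hash='0' * 32)
mock_client_cls.assert_called_once()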
I need help with a Terraform module that I've created. It works perfectly, but I need to add some automation.
I created a module which creates multiple private endpoints, but I always need to put the variable values in manually.
This is the module:
resource "azurerm_private_endpoint" "endpoint" {
for_each = try({ for endpoint in var.endpoints : endpoint.name => endpoint }, toset([]))
name = each.key
location = var.location
resource_group_name = var.resource_group_name
subnet_id = each.value.subnet_id
dynamic "private_service_connection" {
for_each = each.value.private_service_connection
content {
name = each.key
private_connection_resource_id = private_service_connection.value.private_connection_resource_id
is_manual_connection = false
subresource_names = var.subresource_name ### see values on : https://learn.microsoft.com/fr-fr/azure/private-link/private-endpoint-overview#private-link-resource
}
}
lifecycle {
ignore_changes = [
private_dns_zone_group
]
}
tags = var.tags
}
I need to have:
1 - For the private endpoint name: I need it to be generated automatically as "pendp-(the subresource_name value in lower case)-(my resource name => mysql server, for example)".
2 - For the private connection name: I need the value to be generated automatically as "connection-(the subresource_name value in lower case)-(my resource name => mysql server, for example)".
3 - Some automation to detect the subresource_name automatically (if I create a private endpoint for a blob, a mariadb, or a mysqlserver, the module should detect it).
terraform version:
terraform {
  required_version = "~> 1"
  required_providers {
    azurerm = "~> 3.0"
  }
}
The easiest way to combine values automatically would be to use the Terraform string join() function to join multiple strings together. For lower case strings, you can use the lower() function.
Some examples:
name = join("-", ["pandp", lower(var.subresource_name)])
...
name = join("-", ["connection", lower(var.subresource_name), lower(each.key)])
For your third rule, you want to use a conditional expression to determine if it's a blob, or mariadb, or mysqlserver.
In this example, we set an example_name local with a value some-blob-value if var.subresource_name contains a string that starts with "blob", and set it to something-else if the condition is false:
locals {
  example_name = startswith(lower(var.subresource_name), "blob") ? "some-blob-value" : "something-else"
}
There are many options for conditionally checking whether a supplied value matches what you expect and then determining a result based on it. What exactly you want isn't clear from the question, but hopefully this points you in the right direction.
Terraform also has several helper functions that can help if you only need part of a string, such as startswith(), endswith(), or contains(), depending on your needs.
I have an API which prints the success or error message to the console. I am new to Python and I'm trying to read that response. Google throws up plenty of examples using subprocess, but I don't want to run or call any command or subprocess. I just want to read the output after the API call below.
This is the response on the console on success:
17:50:52 | Logged in!!
This is the github link for the sdk and documentation
https://github.com/5paisa/py5paisa
This is the code
from py5paisa import FivePaisaClient
email = "myemailid#gmail.com"
pw = "mypassword"
dob = "mydateofbirth"
cred = {
    "APP_NAME": "app-name",
    "APP_SOURCE": "app-src",
    "USER_ID": "user-id",
    "PASSWORD": "pw",
    "USER_KEY": "user-key",
    "ENCRYPTION_KEY": "enc-key"
}
client = FivePaisaClient(email=email, passwd=pw, dob=dob,cred=cred)
client.login()
In general it is bad practice to get a value from STDOUT. There are ways, but it's pretty tricky (STDOUT isn't made for it). And the problem doesn't come from you but from the API, which is badly designed: it should return a value, e.g. True or False (at least), to tell you whether you are logged in, and it doesn't.
So, according to their documentation it is not possible to know if you're logged in, but you may be able to tell by checking the client_code attribute on the client object.
If client.client_code is equal to something then you should be logged in, and if it is equal to something else then not. You can compare its value after a successful login and after a failed one (wrong credentials, for instance). Then you can add a condition: if it is None or False or 0 (you will have to check this yourself), the login failed.
Can you try doing the following with a successful and failed login:
client.login()
print(client.client_code)
Source of the API:
# Login function:
# (...)
message = res["body"]["Message"]
if message == "":
    log_response("Logged in!!")
else:
    log_response(message)
self._set_client_code(res["body"]["ClientCode"])
# (...)

# _set_client_code function:
def _set_client_code(self, client_code):
    try:
        self.client_code = client_code  # <<<< That's what we want
    except Exception as e:
        log_response(e)
Since this question asks how to capture "stdout", one way you can accomplish this is to intercept the log message before it hits stdout.
The minimum code to capture a log message within a Python script looks like this:
#!/usr/bin/env python3
import logging

logger = logging.getLogger(__name__)

class RequestHandler(logging.Handler):
    def emit(self, record):
        if record.getMessage().startswith("Hello"):
            print("hello detected")

handler = RequestHandler()
logger.addHandler(handler)
logger.warning("Hello world")
Putting it all together you may be able to do something like this:
import logging
from py5paisa import FivePaisaClient
email = "myemailid#gmail.com"
pw = "mypassword"
dob = "mydateofbirth"
cred = {
    "APP_NAME": "app-name",
    "APP_SOURCE": "app-src",
    "USER_ID": "user-id",
    "PASSWORD": "pw",
    "USER_KEY": "user-key",
    "ENCRYPTION_KEY": "enc-key"
}
client = FivePaisaClient(email=email, passwd=pw, dob=dob,cred=cred)
class PaisaClient(logging.Handler):
    def __init__(self):
        super().__init__()
        self.loggedin = False  # this is the variable we can use to see if we are "logged in"

    def emit(self, record):
        if record.getMessage().startswith("Logged in!!"):
            self.loggedin = True

    def login(self):
        client.login()

# Install the handler on the root logger so the py5paisa library's log records reach it
# tutorial here: https://betterstack.com/community/questions/how-to-disable-logging-from-python-request-library/
c = PaisaClient()
logging.basicConfig(handlers=[c], level=0, force=True)
c.login()
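Once login() has run, checking the flag tells you whether the "Logged in!!" message was seen:
print("logged in" if c.loggedin else "login failed")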
I need to add a group with several members, so I used the code below:
import ldap
import ldap.modlist as modlist
# Open a connection
l = ldap.initialize("ldap://localhost:389/")
# Bind/authenticate with a user with apropriate rights to add objects
l.simple_bind_s("cn=manager,dc=maxcrc,dc=com","secret")
# The dn of our new entry/object
dn="cn=semuaFK,ou=group,dc=maxcrc,dc=com"
# A dict to help build the "body" of the object
attrs = {}
attrs['objectclass'] = ['top','groupofnames']
attrs['member'] = 'cn=user1,ou=people,dc=maxcrc,dc=com'
attrs['member'] = 'cn=user2,ou=people,dc=maxcrc,dc=com'
attrs['description'] = 'ini group untuk semua dosen dokter'
# Convert our dict to nice syntax for the add-function using modlist-module
ldif = modlist.addModlist(attrs)
# Do the actual synchronous add-operation to the ldapserver
l.add_s(dn,ldif)
# Its nice to the server to disconnect and free resources when done
l.unbind_s()
Adding members into the group:
attrs['member'] = 'cn=user1,ou=people,dc=maxcrc,dc=com'
attrs['member'] = 'cn=user2,ou=people,dc=maxcrc,dc=com'
But the result is a group with just one member; only user2 is added to the group. How can I make a group and also add several members to it?
member is a multi-valued attribute (just like objectClass). From looking at your code, the problem seems to be that you overwrite the value of the member attribute with the 2nd line. The code should be something like this:
attrs['member'] = [ 'cn=user1,ou=people,dc=maxcrc,dc=com', 'cn=user2,ou=people,dc=maxcrc,dc=com' ]
or
attrs['member'] = [ 'cn=user1,ou=people,dc=maxcrc,dc=com' ]
attrs['member'].append('cn=user2,ou=people,dc=maxcrc,dc=com')
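For completeness, here is a minimal sketch of the whole add with both members supplied at once (assuming the same DNs and server as in the question; note that recent python-ldap versions on Python 3 expect attribute values as bytes):
import ldap
import ldap.modlist as modlist

# Open a connection and bind as before
l = ldap.initialize("ldap://localhost:389/")
l.simple_bind_s("cn=manager,dc=maxcrc,dc=com", "secret")

dn = "cn=semuaFK,ou=group,dc=maxcrc,dc=com"
attrs = {
    'objectclass': [b'top', b'groupofnames'],
    # member is multi-valued: pass every DN in a single list
    'member': [
        b'cn=user1,ou=people,dc=maxcrc,dc=com',
        b'cn=user2,ou=people,dc=maxcrc,dc=com',
    ],
    'description': [b'ini group untuk semua dosen dokter'],
}

l.add_s(dn, modlist.addModlist(attrs))
l.unbind_s()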