organization of discord bot files [closed] - discord

I have a Discord bot that has grown quite large, and for readability I would like to organize it into folders.
I have the main file and the settings file. Right now I am handling HTTP responses in the main file, which makes it quite hard to read when going through the file. Ideally, I would like the HTTP requests and the commands in a different location. Although this is mostly personal preference, I figured someone could show me how theirs is organized. I also have functions that do not fit into any category, such as an embed builder function.

You can use cogs to separate your commands into categories/files.
In your main bot file (bot.py or main.py), write the code that imports os and loads your cog files.
In your bot's root folder, create a folder called cogs, then create a file with the name you want, for example utility, and give it the .py extension. The name of that file (utility.py in this case) is the one you put in the list (marked with # list below).
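With the example names above, the project layout would look like this:

```
bot.py
cogs/
    utility.py
```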
import os
import sys
import traceback

import discord
from discord.ext import commands

client = commands.Bot(command_prefix='!')  # the prefix is just an example

# list
initial_extensions = [
    'cogs.utility',
]

# the code to load all your cogs
for extension in initial_extensions:
    try:
        client.load_extension(extension)
    except Exception as e:
        print(f'Failed to load extension {extension}.', file=sys.stderr)
        traceback.print_exc()

client.run('token')
Now in your cog file (utility.py) write the following code.
import discord
from discord.ext import commands

class Utility(commands.Cog):
    def __init__(self, client):
        self.client = client

    @commands.Cog.listener()
    async def on_ready(self):
        print('Bot is ready!')

# load_extension looks for this entry point
def setup(client):
    client.add_cog(Utility(client))
Do not forget that you will probably need to rewrite some of your code, but the changes are simple.
Hope this helps : )
Docs: https://discordpy.readthedocs.io/en/stable/ext/commands/cogs.html

Where would you store the bot token for a discord.py bot? [closed]

Where do you usually store the bot token/API key for a Discord bot, or for any program that uses a bot token? I just have mine stored in the main.py file. Is there an agreed-upon way to store keys and tokens such as a Discord bot token? My bot works fine, but I would like to know if there is a specific way to store these values or if it's something preferential, like using single quotes versus double quotes.
I've seen one example of a Discord token being stored in a separate JSON file. Is this the preferred way to store keys and tokens?
I'll show you a quick example of what I'm talking about.
import discord
from discord.ext import commands

bot = commands.Bot(command_prefix='-', case_insensitive=True)

@bot.event
async def on_ready():
    print(f"{bot.user.name} is online and ready to go! Bot id: {bot.user.id}")

bot.run('botToken')  # I have mine placed right here in the botToken value.
This quick overview should only be considered if you are working on a hobby project. If you are doing anything commercial, be more serious about it.
Leaving the token in your file (hardcoding it)
Pro
Quick
Contra
Your token is easily exposed if you forget that it is in your file and, e.g., upload it to GitHub
Putting it in a JSON/TOML/other separate file
Pro
Still quite quick
You can gitignore the file where you are storing the token
Contra
You need to parse the token every time you need it
Still insecure if someone gets ahold of the file (for example, if you forget to gitignore it)
Saving your token as an environment variable
Pro
The token never leaves your machine -> very hard for someone to get ahold of it
Quite quick
Contra
You need to be able to access environment variables
Hard to containerize
These are the main "quick and dirty" options. If you have a question about how any specific option works, you may ask here. If you want to know how to implement a specific option, ask another question on Stack Overflow.
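As a minimal sketch of the environment-variable option (the variable name DISCORD_TOKEN is just an example, not a discord.py convention):

```python
import os

def get_token(var="DISCORD_TOKEN"):
    """Read the bot token from an environment variable, failing loudly if unset."""
    token = os.environ.get(var)
    if token is None:
        raise RuntimeError(f"{var} is not set; export it before starting the bot")
    return token
```

Set the variable in your shell first (e.g. export DISCORD_TOKEN=your-token on Linux/macOS), then start the bot with bot.run(get_token()). The token never appears in your source code, so there is nothing secret to gitignore.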

How to delete .sqlite3 file i.e. the database file in SWIFT? [duplicate]

I'm using the SQLite.swift library in my iOS app for database needs. I'm using more than one database with different names, and that works fine. At one point I need to delete the .sqlite3 file, i.e. the whole database file, when it's no longer needed. I'm a newbie to Swift 3, so I don't know how to delete the database. My app has a database called AccountsDB to store the number of profiles. Each profile has a database with its name, and when the user deletes a profile I need to delete the database with that name too. That is my scenario; please guide me to achieve my task. Thanks.
Use FileManager to delete the physical file from your device. I assume you know (or can retrieve) the path to the SQLite DB file, right? In that case, if url is the file URL to your file:
let fm = FileManager.default
do {
    try fm.removeItem(at: url)
} catch {
    NSLog("Error deleting file \(url): \(error)")
}

Mapping Active Directory Users with Postfix [closed]

I created an SMTP mail server and successfully configured it using Postfix, Dovecot, and Roundcube.
Wanting to add functionality and to get active directory users to authenticate, I chose to use pbis (http://download1.beyondtrust.com/Technical-Support/Downloads/PowerBroker-Identity-Services-Open-Edition/?Pass=True) and found that I was able to easily add to the Active Directory domain ultimately using this command after install and completing a few prerequisites:
$ ./domainjoin-cli join TEST.LOCAL testuser
where "TEST.LOCAL" is the domain in active directory and "testuser" is a user account I set up in the active directory domain.
When logging into the account on roundcube:
I use TEST\testuser and am able to log in successfully
This required a slight change to the dovecot configuration file /etc/dovecot/conf.d/10-auth.conf and adding the "\" to the list of characters under "auth_username_chars"
I can send an e-mail to a local Linux account "user" and verify receipt of that e-mail. I have to change the outgoing e-mail address from TEST\testuser@test.local to testuser@test.local because of incorrect syntax.
What I can't seem to do is send mail to the active directory account "testuser"
I get the following error when attempting this:
SMTP Error (550): Failed to add recipient "testuser@test.local" (5.1.1 <testuser@test.local>: Recipient address rejected: User unknown in local recipient table).
This seems to correspond to alias mapping but I don't know how to do that and the guides I am finding online don't seem to quite fit what I am looking to do. No, I do not have virtual mapping. The user accounts I am trying to map to are all under this directory:
/home/local/TEST/
My question is basically this: How do I map "testuser@test.local" to "TEST\testuser@test.local" in Postfix?
Actually, I submitted this a bit prematurely because I found my answer, but I had to adapt it to my environment to get it to work.
Following the directions at www.electrictoolbox.com/update-postfix-virtual-alias-map was incredibly helpful.
The exception was that the /etc/postfix/virtual file had to contain the Windows-style backslashes.
Basically, what I did was:
Add a line to /etc/postfix/main.cf:
virtual_alias_maps = hash:/etc/postfix/virtual
Create a /etc/postfix/virtual file with the following contents:
testuser@test.local TEST\\testuser
Apply the settings:
postmap /etc/postfix/virtual
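Putting the pieces together, and assuming the example domain and user above, the two files look like this (a sketch; adjust names to your environment):

```
# /etc/postfix/main.cf (excerpt)
virtual_alias_maps = hash:/etc/postfix/virtual

# /etc/postfix/virtual
testuser@test.local    TEST\\testuser
```

After editing, run postmap /etc/postfix/virtual to rebuild the hash and reload Postfix. You can verify the lookup with postmap -q testuser@test.local hash:/etc/postfix/virtual, which should print the mapped local recipient.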

Virtual hosting FTP problems with CakePHP v2.4.6 [closed]

I am having some trouble with my upload of CakePHP 2.4.6 to GoDaddy. Specifically, the file "CakeResponse.php" doesn't want to upload correctly: when I upload it, GoDaddy appends a ".gdav" extension to it and changes the permissions to "00".
I am using FileZilla 3.7.3 and have tried renaming the file (both on the server and locally), which works but does nothing to fix the problem. It also doesn't allow me to change permissions on the file.
Command: SITE CHMOD 704 CakeResponse.php
Response: 550 Could not change perms on CakeResponse.php: Permission denied
Any thoughts on what the problem might be?
Update: I've downloaded Cake v2.4.0 and CakeResponse will still not upload. I've also tried from my Windows PC at work using FileZilla 3.7.4.1
Based on the previous comment, my guess is that gdav stands for GoDaddy antivirus...
I went ahead and called them and yes it is their new antivirus they are still working the bugs out on.
I got around the issue by using the "File Manager" website that GoDaddy provides, which allowed me to upload the file with permissions.
In hopes of helping to lead to an answer with another case study, let me share the experience I had with the same issue, but on a WordPress file:
/wp-admin/includes/ajax-actions.php
Here are my specs:
GoDaddy Deluxe Web Hosting, Linux
Cyberduck v 4.2.1
Also tried in FileZilla v 3.6.0.2 (same results)
MacBook, OS X v 10.6.8
To be clear, I also could not change permissions (although it deceptively acted like it was letting me) and renaming the file didn't help me either. I also tried uploading a new dummy file with a different name (work.php) and with different content (I think "WORK!" was all I wrote in it), which uploaded fine with normal permission (704 I think). I then tried pasting the contents of the above ajax-actions.php into it and re-uploading it. It added a .gdav to that file as well and changed the permissions to 000.
Using GoDaddy's File Manager became my workaround as well, thanks to timmsimpson (I may not have thought of that).
Yep, it's an issue with the GoDaddy antivirus. Log in to the GoDaddy host control panel, go to the File Manager, rename the file to remove the .gdav extension, then select the file and change its permissions to 644.
Update: I'm not sure if this is related to gdav, but for the past three weeks my WordPress sites, which normally load in 2 seconds, have been taking anywhere from 5 seconds to never loading (connection resets, lost DB connections, etc.) intermittently throughout the day. The other thing implemented three weeks ago at GoDaddy was SiteLock. Also, this time of year is infrastructure rebuild/enhancement time at GoDaddy. POD6 is currently hosed.

Exporting tables from QuickBase [closed]

I am assigned the task of taking all the tables and records we have in QuickBase and importing them into a new database in MS SQL Server. I am an intern, so this is new to me. I have been able to export all tables except two into CSV files so that I can import them into SQL Server. The two tables that will not export show a QuickBase error saying that the report is too large and the maximum number of bytes in a report has been exceeded. My question: can someone recommend a workaround for this? Also, do the CSV files have to be located on the server to import them, rather than stored on my machine?
Thanks in advance for any help.
When you export to CSV files, those files download from the browser onto your local machine, NOT the server. If you are running into issues with reports being too large, the filtering workaround described in the other answer is good enough.
The other possibility is to use the QuickBase API here:
http://www.quickbase.com/api-guide/index.html
Specifically, API_DoQuery is what you want to use.
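As a rough sketch of how a legacy API_DoQuery call is addressed (the realm, table ID, and token below are placeholders, and the exact parameter set should be checked against the API guide linked above):

```python
import urllib.parse

def doquery_url(realm, dbid, usertoken, query=""):
    """Build the URL for a legacy QuickBase XML API API_DoQuery request.

    realm     -- your QuickBase subdomain, e.g. "mycompany"
    dbid      -- the table's database ID
    usertoken -- an API token for authentication
    query     -- optional QuickBase query string, e.g. "{3.GT.0}"
    """
    params = {"a": "API_DoQuery", "usertoken": usertoken, "fmt": "structured"}
    if query:
        params["query"] = query
    return f"https://{realm}.quickbase.com/db/{dbid}?" + urllib.parse.urlencode(params)
```

Issuing a GET against that URL returns the records as XML, which sidesteps the report-size limit of the browser export; you could page through large tables by querying on a date or record-ID field in chunks.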
I'm not a QuickBase expert or anything, but some poking around indicates that this is a limitation of QuickBase reporting. One option would be to export those tables in sections: filter first for records older than 60 days, then for records 60 days old or newer, or some similar filter that splits each table into two or more chunks of mutually exclusive records.
When importing the CSVs with the import wizard, the wizard will give you the opportunity to navigate to the file. If you are running SSMS on your local computer (which you should be), then the file will be accessible if it's on your local machine.
You may try some existing OpenSource ETL tool to do that directly (QB->MSSQL) - i.e. without landing any intermediate CSV file.
Look for example at CloverETL (which I used to interact with QuickBase):
http://doc.cloveretl.com/documentation/UserGuide/index.jsp?topic=/com.cloveretl.gui.docs/docs/quickbaserecordreader.html
They have a Community edition (free): http://www1.cloveretl.com/community-edition
