Databases Programming Software IT

Keeping Customer From Accessing My Database? 567

cyteen02 writes "We run a data processing and tracking system for a customer in the UK. We provide a simple Web site where the customer can display the tracking data held in our Oracle database. From these screens they can query based on a combination of 15 different data fields, so it's pretty flexible. We also provide a CSV report overnight of the previous day's data processing, which they can load into their own SQL Server database and produce whatever reports they want. Occasionally they also want one-off specific detailed reports, so we write the SQL for that and send them the results in an Excel spreadsheet. This all ticks along happily. However, they have now asked for direct read-only access to our Oracle database, to be able to run ad-hoc queries without consulting us. As a DBA, my heart sinks at the thought of amateurs pawing through my database. Unfortunately, 'because you are stupid' is not considered a valid business reason to reject their request. So can any Slashdotters assist me in building my case to restrict access? Have you experienced a similar situation? Have you had to support this sort of end user access? How would you advise me to keep my customer away from my precious tables?"
This discussion has been archived. No new comments can be posted.

  • by ShieldW0lf ( 601553 ) on Friday May 16, 2008 @01:41PM (#23436626) Journal
    You could always put together a demonstration, in which you illustrate how easily an unskilled user can issue the wrong query and bring the server to its knees.
  • Liability (Score:2, Informative)

    by LuminaireX ( 949185 ) on Friday May 16, 2008 @01:45PM (#23436728)
    First of all, I am not a lawyer. YMMV, etc...

    Not sure what sort of information they're requesting access to, but if any of the data is sensitive, supplying them a password, even in a read-only capacity, increases your liability if that information is stolen through unauthorized access. Basically, if a third party somehow acquires that username/password and steals a dump of every firstName, lastName, DOB, and social security number from the database, you could expose yourself to legal trouble for enabling the access in the first place.

    I would err on the side of caution and explain that you would rather third-party access not exist at all than open yourself up to the risks of unauthorized use of credentials.
  • by Builder ( 103701 ) on Friday May 16, 2008 @01:47PM (#23436776)
    Actually, it can be a huge deal - badly written read queries can bring a database to its knees, slowing it down for the critical business writes.
  • by stoolpigeon ( 454276 ) * <bittercode@gmail> on Friday May 16, 2008 @01:48PM (#23436816) Homepage Journal
    It is not difficult to prevent this: Oracle allows limiting resource consumption per user, among other things.
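    A sketch of how that might look. The profile name, limit values, and the REPORT_USER account are assumptions; resource limits are only enforced when the RESOURCE_LIMIT parameter is enabled:

    ```sql
    -- Sketch only: ADHOC_LIMITS, the limit values, and REPORT_USER are hypothetical.
    ALTER SYSTEM SET RESOURCE_LIMIT = TRUE;

    CREATE PROFILE adhoc_limits LIMIT
      CPU_PER_CALL           30000     -- max CPU per statement, in hundredths of a second
      LOGICAL_READS_PER_CALL 100000    -- max data blocks read per statement
      SESSIONS_PER_USER      2         -- cap concurrent sessions
      IDLE_TIME              30;       -- disconnect after 30 idle minutes

    ALTER USER report_user PROFILE adhoc_limits;
    ```

    A statement that blows past CPU_PER_CALL or LOGICAL_READS_PER_CALL is aborted with an error instead of being allowed to grind on.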
  • by jedidiah ( 1196 ) on Friday May 16, 2008 @01:50PM (#23436860) Homepage
    ...although this is pretty trivial to avoid in Oracle. You just need to make sure that the ad hoc query account has limited access and limited resource privileges. However, the customer is bound to complain when Oracle refuses to run their heinous query.
  • by Anonymous Coward on Friday May 16, 2008 @01:53PM (#23436936)
    Use Oracle Data Guard to replicate the database and let them query the replicated database.

    Use Oracle RAC and have your application use one server on the node and they can use the other.

    Use CPU_PER_CALL to limit their impact.

    Honestly, this isn't a Slashdot question; you need to be on somewhere like

    You have many options to allow these users the access they desire, but so far no strong reasons to deny them. So go for budget. Write up a RAC or Data Guard proposal, show them the additional initial and ongoing costs, and let that put them off.
  • Re:Why? (Score:5, Informative)

    by ZeroConcept ( 196261 ) on Friday May 16, 2008 @01:53PM (#23436940)
    1) Who is responsible for changing the customer's queries when the schema changes and their reports no longer work?
    2) Who is liable if the customer queries affect the performance of other processes/services (lack of index usage, expensive queries, etc)?
  • by arnie_apesacrappin ( 200185 ) on Friday May 16, 2008 @01:53PM (#23436962)
    If you really want to make a case to management, write out all the worst case scenarios for your management (I know that there should only be one WORST case scenario, but work with me here). If you can list the things that can go wrong, you might be able to help. I once worked in an environment where a developer wrote a query that locked a database for four hours until we killed it. One of the DBAs was able to re-write his query so that it completed in under 10 seconds. His access was then revoked.
    Tell your boss how someone with read-only access could affect the rest of the users, and you should be on the right track.
  • Re:Why? (Score:3, Informative)

    by Culture20 ( 968837 ) on Friday May 16, 2008 @01:55PM (#23437000)
    The worst thing you can do with your first "no" is provide a reason. Then they will focus on this reason until they find a chink in your armor, and you'll come up with another reason and they'll get angry.

    First, just say "no".
    Only if they demand an answer, give them as many reasonable reasons as you can think of, all at once, saying "no external connections; security policy" (only if true; and it should be... just because your DB has passwords doesn't mean it doesn't have vulnerabilities) and/or "potentially random queries against the production database can degrade performance" (again, only if true). Both are strong enough but vague enough that the client should be quiet about it.
    Always attempt to couch reasons as near to what the client wants as possible. They want their DB safe from prying eyes; they want their website to run well. Try not to say something like "for the security of our servers" unless you put it at the bottom of the list, because they don't care about that, and if they're like most clients, they won't grasp that (yourserver.owned == theirdata.compromised).
    If they're still persistent, get another job; it sounds like your management doesn't have a backbone.
  • by CastrTroy ( 595695 ) on Friday May 16, 2008 @02:01PM (#23437146) Homepage
    Well, you could always ship the data to another machine and have them access it from there. You could easily do it so that it's only a couple of minutes behind real time. If they really think they need the information that badly, they will pay for the cost of the extra machine. If they really can't wait for the CSV files to come in at night, then either it must be pretty important for them to have the data right now, or it must be really difficult for them to manage their own copy based on the CSV files. Shipping it to another machine would allow them to run stupid queries without compromising the performance of the main system.
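    One way to sketch that "couple of minutes behind" copy in Oracle is a periodically refreshed materialized view; the table name and the five-minute interval here are made up:

    ```sql
    -- Sketch: TRACKING_DATA and the refresh interval are hypothetical.
    CREATE MATERIALIZED VIEW tracking_copy
      REFRESH COMPLETE
      START WITH SYSDATE
      NEXT SYSDATE + 5/1440        -- refresh every 5 minutes (5/1440 of a day)
    AS
      SELECT * FROM tracking_data;
    ```

    The customer queries TRACKING_COPY; the base table never sees their workload directly.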
  • by Apache ( 14188 ) <> on Friday May 16, 2008 @02:08PM (#23437282)
    This is seriously the best option. Just write a detailed quote for set up of the server, OS installation, DB, replication, licenses, SSL tunnels required, and recurring rack space allocation and power costs that will be required.

    They will either decide not to due to costs (you win) or pony up the money (you win again).

    Remember your time is (or at least should be) billable.
  • by tinkerghost ( 944862 ) on Friday May 16, 2008 @02:16PM (#23437494) Homepage
    Too simple:
    select bt.* from big_table bt, bigger_table bbt, biggest_table bbbt order by non_indexed_column;
    Perhaps a good left outer join tossed in there to really thrash the drives.
  • by Eco-Mono ( 978899 ) on Friday May 16, 2008 @02:21PM (#23437586) Homepage
    You've already got a cross-join (probably by mistake, given that you even bothered to mention left outer as useful). I think that'd be illustration enough :)
  • by ADRA ( 37398 ) on Friday May 16, 2008 @02:30PM (#23437746)
    If the poster is really anal about access to 'the' data but not necessarily worried about schema browsing, etc., your suggestion is really the way to go.

    Oracle supports standby databases that can be locked into read-only mode. You can even change how often the data is updated, relieving your main system of the burden of direct customer interaction.

    Worst case scenario is that the customer chokes the box to its knees and then THEY can't access your data, but it doesn't affect your operations at all.

    Since you should most certainly be charging for this service, the profits reaped should more than pay for the small investment in hardware.

    Plus, if your main DB does go down in flames, you may be able to requisition the read-only DB as a temporary main if downtime becomes excessive.
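    For a physical standby, the switch between applying redo and serving read-only queries might look roughly like this (classic, pre-Active Data Guard behaviour; treat it as a sketch, not a runbook):

    ```sql
    -- On the standby: stop redo apply, then open read-only for customer queries.
    ALTER DATABASE RECOVER MANAGED STANDBY DATABASE CANCEL;
    ALTER DATABASE OPEN READ ONLY;

    -- Later: resume managed recovery so the standby catches up with production.
    -- (The standby may need to be restarted into MOUNT state first.)
    ALTER DATABASE RECOVER MANAGED STANDBY DATABASE DISCONNECT FROM SESSION;
    ```

    While open read-only the standby falls behind; the trade-off is that the customer's worst query can only hurt the standby.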
  • by tinkerghost ( 944862 ) on Friday May 16, 2008 @02:48PM (#23438030) Homepage

    why join tables anyway? if you really want to fsck your server just SELECT * FROM HUGE_TABLE t1, HUGE_TABLE t2, HUGE_TABLE t3;
    Because the unbounded join on bbbt is the most likely type of problem you'll see in letting other people write their own sql statements, so it stands as a perfect example as to why they shouldn't be allowed to do it.
  • by Anonymous Coward on Friday May 16, 2008 @02:52PM (#23438110)
    That kind of locking issue does not happen in Oracle, due to its multi-version concurrency control strategy and locking implementation. MS SQL places locks on every row a SQL statement touches, for the duration of that statement, to prevent changes to those rows while the statement is running. Oracle avoids this by keeping "backups" (rollback/undo) of any changed rows and reading from those backups instead if needed. Locks are still used, but they don't block readers. Also, MS SQL uses a "lock manager" process which keeps track of every lock in the entire database. Statements and transactions touching many rows will therefore cause the lock manager to allocate a lot of memory to track all the locks. It then uses lock escalation techniques, i.e. changing the granularity of the locks so fewer locks are used at the expense of larger locked areas, to keep memory usage down. Oracle does not have a lock manager (except when dealing with distributed transactions and two-phase commits (2PC) across multiple databases); Oracle stores the lock information in the headers of the rows themselves.
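    The difference shows up with two concurrent sessions; the table and column names here are made up:

    ```sql
    -- Session A (writer) updates a row but has not yet committed:
    UPDATE orders SET status = 'SHIPPED' WHERE order_id = 42;

    -- Session B (reader), at the same moment:
    SELECT status FROM orders WHERE order_id = 42;
    -- Oracle: returns the pre-update value, reconstructed from undo; no waiting.
    -- SQL Server (default READ COMMITTED): blocks until session A commits or rolls back.
    ```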
  • by PRMan ( 959735 ) on Friday May 16, 2008 @03:30PM (#23438654)

    And when it came time to build version 2 and make some internal changes we couldn't because the customers had grown accustomed to the schema being a certain way.

    That's what Views are for. They keep using the "old table", which is now a view. You put the changes in the "new table". You don't need to change your code either, except in the instances that required you to change it in the first place.
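    A minimal sketch of the technique, with hypothetical table and account names:

    ```sql
    -- The physical table gets a new name/shape; clients keep the old name via a view.
    ALTER TABLE customer RENAME TO customer_v2;

    -- Expose the old schema as a view over the new table.
    CREATE OR REPLACE VIEW customer AS
      SELECT cust_id, first_name, last_name   -- same columns the old table exposed
      FROM customer_v2;

    GRANT SELECT ON customer TO report_user;  -- the read-only account sees only the view
    ```

    The customer's saved queries keep working against CUSTOMER while the real table evolves underneath.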

  • by Anonymous Coward on Friday May 16, 2008 @04:13PM (#23439268)
    What he said.

    The customer is giving you a chance to add value, but doing so will take some work on your part to prevent negative performance impact on the production database.

    So, you charge them for the work and create a data warehouse for them, perhaps including some analytical tools to help them query/ data mine effectively. The more they do this, the more they will probably want your help to do it better.

    Everybody is happy.
  • by IdleTime ( 561841 ) on Friday May 16, 2008 @05:06PM (#23440010) Journal
    See the Oracle Database Security Guide. If you have MetaLink access, see:
    Note 67977.1 Oracle Fine Grained Access Control - Working Examples
    Note 99250.1 Understanding Fine-Grained Access Control
    If that does not help, open a service request, select "SECURITY" as the component, and get some assistance.
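    Fine-grained access control (VPD) attaches a predicate-generating function to a table via the DBMS_RLS package; a rough sketch, with the schema, table, context, and function names all hypothetical:

    ```sql
    -- Policy function: returns a WHERE-clause fragment Oracle appends to every query.
    CREATE OR REPLACE FUNCTION customer_only (
      p_schema IN VARCHAR2,
      p_object IN VARCHAR2
    ) RETURN VARCHAR2
    AS
    BEGIN
      RETURN 'customer_id = SYS_CONTEXT(''app_ctx'', ''customer_id'')';
    END;
    /

    -- Attach the policy so SELECTs on TRACKING_DATA are silently filtered.
    BEGIN
      DBMS_RLS.ADD_POLICY(
        object_schema   => 'APP',
        object_name     => 'TRACKING_DATA',
        policy_name     => 'customer_filter',
        function_schema => 'APP',
        policy_function => 'CUSTOMER_ONLY',
        statement_types => 'SELECT');
    END;
    /
    ```

    With a policy like this in place, a direct-access account can only ever see its own rows, no matter what SQL it writes.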

"Everyone's head is a cheap movie show." -- Jeff G. Bone